If you have read this blog in the past few weeks, you know that I strongly support the notion of peer review of mind maps. However, I acknowledge it is not fair to keep harping on this issue without offering some suggestions for implementing such a system.
I selected Biggerplate as my example because it is the largest and highest-quality archive of mind maps I know of, and I greatly support their work.
I believe that implementing such a system would increase the usefulness of mind map communication and advance this area of inquiry.
Thanks Liam. I am glad to see we are on the same page in terms of the quality issue and I am glad that I do not need to deal with either the implementation or cost issues.
You could recruit a number of reviewers like me to anonymously review SOME of the maps on your site. You would assign each reviewer a code (R1, R2, R2D2, etc.). You would assure your “readers” that these reviewers had appropriate professional credentials (content expertise, mapping expertise) and no commercial conflicts (in most areas of science, this means each reviewer either states that there are no commercial conflicts of interest or, where something might be construed as a conflict, discloses the specific activities that could be perceived as prejudicial).
Were I doing some of this reviewing (and I am neither committing to it nor seeking the job, which I believe would probably be a conflict of interest because of my involvement with this blog), I think it would be more interesting to select the maps I wish to review rather than work from a pre-assigned set. You would also need an editor, primarily to make sure reviewer comments are not personal attacks on an author or motivated by something other than the content of the map (a competing commercial interest, bigotry, scientific bigotry, lack of reviewer due diligence, reviewer rigidity); this is not unlike what Amazon or Apple do with product reviews before allowing them to be posted.
Some thoughts. I did find many of your comments quite compelling and I hope these suggestions are helpful. George
One further comment. I would categorize reviewers by:
a) area of professional expertise
b) experience in their profession
c) experience with mind mapping
d) current mind mapping programs and methods used
e) possible conflicts of interest
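Purely as an illustration, a categorization like the one above could be captured in a simple reviewer record. Everything here (the `Reviewer` class, its field names, the example values) is hypothetical and not anything Biggerplate actually implements; it is only a sketch of how anonymized reviewer profiles with disclosed conflicts might be stored:

```python
from dataclasses import dataclass, field

@dataclass
class Reviewer:
    """Hypothetical anonymized reviewer profile; all field names are illustrative."""
    code: str                     # anonymous code shown to readers, e.g. "R1"
    expertise_area: str           # a) area of professional expertise
    years_in_profession: int      # b) experience in their profession
    years_mind_mapping: int       # c) experience with mind mapping
    tools_and_methods: list[str] = field(default_factory=list)      # d) programs and methods used
    conflicts_of_interest: list[str] = field(default_factory=list)  # e) disclosed conflicts

# Example: a reviewer with one mapping tool and no declared conflicts
r1 = Reviewer("R1", "medicine", 20, 8, ["MindManager"])
print(r1.conflicts_of_interest)  # an empty list means "none declared"
```

A structure like this would let an editor publish each reviewer's credentials and disclosures alongside the anonymous code, without revealing identities.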
Hi George, this is a very interesting map, and certainly an idea with merit. Thanks for taking the time to write about this and share your thoughts. Here are a couple of early perspectives from me:
At our recent conference in London, several participants discussed trying to generate better peer review of Biggerplate maps, based on their merit as ‘mind maps’, rather than their content accuracy etc. This was an interesting discussion and seems to overlap with your idea, although coming from a different quality lens – map format quality vs map content quality. The challenge that we shared with the conference group is that peer engagement with individual maps (indicated by ratings and comments) has historically been pretty low relative to the number of views and downloads a map receives. For example, the top map on Biggerplate has over 64,500 views and 9,600 downloads. This map has (over 4 years) received just ten comments, and several of these are simple “nice map” comments rather than more in-depth review/analysis.
Admittedly you are proposing something a little more sophisticated than simple comments or ratings, but this engagement gap gives some indication of the challenge we face. Many people come to Biggerplate, view/download a map or two, and think no more about it until their next visit, when they simply repeat the view/download process! Comments and ratings do not feature much in people’s actions. There are things we are doing to try and address this issue in itself, but it remains a useful perspective to consider.
An approach that might work along the lines you propose is to set up a panel of people with the ability to rate/approve these maps as you suggest. If the criterion is to rate the content rather than the mind map itself, however, this comes with some interesting questions: does a doctor have the time to scan through medical mind maps to approve them, or a teacher the time to scan through maps for revision? If someone follows the information in a map and it leads down a bad road (bad exam grades, or medical malpractice), would any doctor, teacher, or panel member want to be held responsible for having ‘approved’ the map content or putting their name to it? If someone is willing to do these things, would they not expect to be paid for the service? Finally, how does Biggerplate quality-check the quality checkers?!
These are questions we talked about when we first started Biggerplate in 2008, with the original idea of creating a teacher-reviewed mind map library for schools. Back then, the sheer complexity of the potential scenarios meant we would never have got the site off the ground if we had tried to ‘cover all bases’. While the site is now up and running well, the complexity of scenarios remains, only now the risk would be that it consumes all time and resources for the sake of a handful of peer reviews (going back to my earlier point). It is therefore a hard thing to even contemplate taking on in addition to the numerous other projects we are pushing forward at the moment, but certainly not something to write off entirely either now or in the future.
Importantly, I should reiterate that I think the theory is a good one. It was (as I say) part of our original thinking for Biggerplate, and it would indeed be great to have content that has some form of official approval and content quality control. However, as yet, I cannot fathom a way of approaching this that does not start to look quite complex quite quickly! I hope that others may engage with this conversation and share their own perspectives, and perhaps this might push the thinking beyond the limits of my own brain (not hard to do…!)
Thanks again for this idea. It would have made for an interesting conference question at Biggerplate Unplugged!
Do you invest time and money building a more sophisticated peer-review system only to see it used by a handful of people?