A few months ago I made a post about comparing three web sites in terms of the usefulness of each for collating links of related materials. This is a slight expansion of my prior concept and map. Branches can be used not only to show positions (rank, size, weight) but also the “reasons” for the position.
As before, in reference to the original title “Pinterest Pins Scoop.it and PearlTrees,” I am referring to “pin” in the context of wrestling.
Pinterest, Scoop.it, and PearlTrees compete in the same space to be the best web-based way to refer readers of your blog, tweet stream, or web site to alternate sources of information.
Irv Oii is known to many international news organizations and researchers as a star data journalist. Being a home worker (although home may be the UK, Ohio, the Middle East, Central Africa, Hong Kong, or Antarctica) and a fairly reclusive person, nobody seems to have met Irv. Some speculate that he might be a Jewish Asian-American. Others believe Irv is short for Irvelina, a Russian immigrant physician who went to Ohio (or was it Ojai, California) when the Soviet science programs collapsed and turned into the lower-funded Russian collaborative efforts with the EU and USA. The collapse of the Soviet Union resulted in the closing of her laboratory in Minsk. Some even think Irv Oii is an acronym.
Irv is thus an enigma and no pictures of her/him seem to exist. An artist’s conception (mine) based on the writings and consultations of Irv Oii on healthcare breakthroughs is shown below. My belief is that a portrait of Irv should hang over the desk of every data journalist and researcher.
Big data this, big data that. Wow. In the end we will have better ways to sell underwear, automobiles, and "next day" pills (although in the latter case politics and religion might actually trump Amazon and Google). Blind empiricism. Every time you click a key on the Internet, it goes into some big database.
“Little data” — lovingly crafted to test theories and collected and analyzed with great care by highly trained professionals — has built our theories of personality, social interactions, the cosmos, and the behavioral economics of buying or saving.
Big data drives marketing. Little data drives the future through generalizable theory.
I started writing about the importance of the content in the mind map — facts and important information well researched — back in November 2012. For the next few weeks I intend to repost some of these posts with my updated thoughts about Mind Mapping 3.0 and what I would now call Mind Mapping 4.0. I will introduce Mind Mapping 4.0 after reviewing some of my views about Mind Mapping 3.0.
It’s fine to put your own notes or feelings or ideas into a mind map that will be for your own use, or one which will be clearly labelled as your opinion. But, if you want to put ideas into general circulation as “facts,” you need to have done your homework and tie the information in the maps to established research, clinical findings, and expert opinion (and document whose expert opinion it is, whether that of someone else or yourself). Mind Mapping 3.0 was the introduction of high-quality data into this useful method of thinking.
I would categorize the pioneering efforts of Tony Buzan and others to introduce and popularize the method of mind mapping as Mind Mapping 1.0 and the parameterizations and resulting computer programs by ThinkBuzan, Topicscape, Mindjet, and others as Mind Mapping 2.0.
[As I saw it in 2012 and continue to view it in 2015] Mind Mapping 3.0 is the integration of computer-assisted mind mapping methods, artistic sensibility to enhance visualization, AND MOST IMPORTANTLY, substantive, creative, well-documented valid and reliable content of great importance.
The fictional detectives would have been great program evaluators. All looked at all types of data. Miss Marple was a model of pleasantry who could work her way into an organization or group and, simply by observing, see it as it was without changing anything. Holmes and Watson — whether in the original books and movies, the Iron Man version of the movies, their current BBC incarnation in 21st Century London, or their CBS incarnation in 21st Century Manhattan with Dr John Watson now Dr Joan Watson (for the better) — use Holmes’ razor-sharp mind and Watson’s intuitiveness and questioning. Sam Spade brought wisecracks, an iron fist, and an underlying sensitivity.
Program evaluation is not about conducting research, randomly assigning participants to conditions, or using quasi-experimental designs. Program evaluation is about understanding why programs produce certain outcomes, intended or not, positive or not, unique or not. To truly understand a program, quantitative and qualitative data need to be collected with great attention to the sensibilities, needs, risks, and potential confidentiality breaches of program participants, program staff, program administration, funders, and other stakeholders.
I love program evaluation. Every program is unique and at the same time representative of certain classes of human service organizations.
Be a detective. Look carefully and understand the beauty of a well-running program and how to help staff improve a program that is not working as well as it could.
This is the first of a series of posts I am making about program-organizational (and individual) evaluation. Much of what I will discuss is not in the mainstream of traditional program evaluation methodology.
My approach is different. It works.
In this first section the point is — obviously — that evaluation is iterative and nonlinear. This led to my first model, that EVALUATION IS DETECTIVE WORK, several decades ago. [Perhaps that explains my current obsession with all versions of Sherlock Holmes, whether in the original, in present-day London, in present-day New York, or as played by Iron Man.] At any rate, it seems ELEMENTARY to me that instead of thinking of program evaluation as a linear research experiment with a fixed design (a metaphor that works at best imperfectly), it is more important to treat evaluation as detective work where good rules of evidence must be followed and the evaluator is at fault if all outcomes are not found.
My initial development of the Detective Model in 1992 came from my observation that in much traditional program evaluation the evaluator applies a flawed “research” experimental model, and the insensitivity of this approach means that a program looks worse than it is because the evaluation methodology is in error. Who pays for this problem? The program, of course, since the evaluator walks away saying that the “program sucks” and not that the evaluator screwed up. In the Detective Model, applied iteratively and nonlinearly, the evaluator and the program are partners, and the responsibilities and level of success of each are clear.
I have been writing (and mind mapping) a lot recently about the need to make sure that mind maps purported to contain “expert” information are valid, reliable, important, and data-driven. I have noted that I also think these mind maps are better communication devices if they are “organic” (in the sense of Tony Buzan) and “artistic” and creative. And I am fairly sure that valid and memorable organic mind maps can be much better for encoding information into memory.
The best example I have found of a professional who consistently produces valid, reliable, important, data-driven, organic, artistic mind maps is Hans Buskes, who posts his work frequently on his blog mastermindmaps and tweets as @hansbuskes. Dr Buskes’ maps contain well-researched information that meets current standards of excellence, are easy to understand, and are data-driven. Look at his two English-language e-books on mind mapping. The book available on iTunes is offered for free.
I view the work of Dr Buskes as the standard I hope to achieve.
The examples are partial screen clips of two of Hans Buskes’ maps. See the mastermindmaps blog site for the full maps and explanatory materials.
Content is Queen. The ultimate point of any mind map is to use and present information clearly in a way that communicates conclusions that are valid, reliable, and important.
Some examples. Are all of those mind maps floating around showing psychological variables and purporting to illustrate major findings and theories actually using valid information? (Guessing what all people feel like or how they learn, and thinking it must be valid since, after all, you are a human, is probably not an indication that you are using highly valid data.) What is the expertise of the individuals who generated the information portrayed in the mind map? Was the information based on empirical studies, well-established theory, the musings of a pop psychology writer, what your Mom taught you, what your best friend thinks, what you saw in a movie? Did you (as a student or casual reader) just read a popular psychology book and accept what that person wrote on how you can be more rich, famous, happy, socially connected, sexy, and thin?
Much attention in mind mapping goes into the “artistic presentation” aspects of the maps, the colors, the rules, the images. And yes, prettier, neater, more original, and more creative maps are probably better received than those that use none of the great tools of visual thinking. But the reality is that the clothing does not make the person nor does the artistry of the map make the content more valid or reliable or important.
The first mind map below shows some of my thoughts and suggestions about how mind maps should be reviewed by experts in the content areas being addressed if the map will be used for purposes other than personal learning or process documentation or as art. That is, if the point of the map is to present facts, then the purported facts really need to be checked by someone who is an expert in the content area. In most cases, I have no problem with authors being responsible for their own work so long as they clearly state their own expertise levels and where the data for the mind maps originated. I have a big problem with someone who is not a trained mental health professional telling the world how to diagnose depression or ADHD. If the author of the map is not an acknowledged expert presenting her or his own work, then the source and limits of the information in the mind map need to be stated, and in some cases, independently evaluated.
The second mind map is actually just the first one produced in iMindMap exported into the alternative computer program MindNode Pro. Is the first map prettier than the second? Sure seems so to me. Is the first map more valid? No. It contains identical information. Does the first map communicate better than the second? Sure seems so to me.
Keep in mind that the goal of most mind mapping is to present valid, reliable, and important information in a way that is easily understood, easily remembered, and easily communicated. Using this criterion, the first map is probably significantly better.
The third mind map is identical in content to the two maps just considered but was generated using default options in the program XMIND. The style of the mind map is similar to that of another program (Mindjet, AKA MindManager), and is the style that many argue is best for presenting information to those in business.
Hopefully by the time you read this, you will have looked carefully at the actual content of the mind map in one or more of the variations. Content is Queen; it is all about the ideas. In the process of mapping, we need to incorporate references to the source of the information displayed. Pretty is good and memorable, but is not more important than the information presented. Content is Queen, although she does look better in a nice dress or business suit.
There are lots of different applications of mind mapping methods to such areas as brainstorming, task management, scheduling, journaling, and sharing basic information (great day to play basketball!). Other mind maps may tell us about scientific experiments and theories, political arguments, historical events, anatomical features of the human body, the quality of hotels in Barcelona, or expert rankings of world football (soccer) teams projected to finish near the top in the World Cup tournament. How do you know a real expert has ranked your favorite football teams correctly? How do you know that the student who created the cute mind map of the human body as a subway map actually labeled the parts with the correct names? What are the professional qualifications of the “expert” who says the world is flat? Do experts believe the purported expert who drew the mind map? Is the information in the mind map you found and downloaded from the Internet really going to tell you what you need to know for your organic chemistry test in two hours?
I sure hope my doctors studied from factually correct mind maps, not just pretty ones given away by a pharmaceutical company. And (since I have a doctorate in psychology), I am really sick of seeing mind maps that say they contain psychological principles that will make you happier, thinner, less anxious, more sexy, and help you self-diagnose whether you have bipolar disorder and which drug would be best to help you and should be ordered from an Asian or Mexican pharmacy over the Internet (URL at the bottom of the map).
Mission-critical information in mind maps should be carefully reviewed by experts in the content of the maps to minimize the number of cases where misinformation hurts people. If such a review has not been done, or if the author of the mind map does not provide adequate credentials to assess professional competence, I recommend you do not use such information for making personal or business decisions. While I love artistic maps that are well-designed and “clean” in their appearance, and spend a lot of time trying to emulate the best, adherence (or not) to the mind mapping rules of Tony Buzan and the use of a wonderfully artistic program in no way make the information in the maps correct. Think about that carefully the next time you download a mind map from the Internet and try to study or make a business decision; that’s a fact, Jack.
It’s also a fact that these comments apply equally to infographics, concept maps, and other information visualizations.
My next post is going to have a lot to say about the importance of content and how to assess whether that pretty map you just found contains valid, reliable, and important information.
When you look at the web sites of mind mapping experts, you tend to see the beautiful maps they have drawn… about this thing and that thing and the other thing the expert six web sites away is saying today.
Information visualizations (mind maps, concept maps, Aunt Tildy’s homemade maps) are about presenting CONTENT in a way that makes it easier to understand.
Real information that is believed by content experts? Information that is reliable and valid? Information provided by credible sources (not Cousin Herbert after a frat party) such as well known polls and surveys, peer-reviewed journal articles, official government statistics, the web site of the guy who watchdogs all of the government statistics, and other credible sources speaking on or off the record?