
social, health, political imagery through the lens of George J Huba PhD © 2012-2017

Search results for big data

I consider John W. Tukey to be the King of Little Data. Give him a couple of colored pencils, the back of a used envelope, and some data, and he could bring insight to whatever you were looking at by using graphic displays, eliminating “bad” data, weighting the findings, and providing charts that would allow you to explain what you were seeing to those who had never been trained in technical fields.

Tukey’s approach to “bad data” (outliers, miscodings, logical inconsistencies) and to downweighting data points that probably make little sense is what will save the Big Data Scientists from themselves, by reducing the likelihood that a few stupid datapoints (like those I enter into online survey databases when I want to screw them up to protect privacy) will strongly bias group findings. Medians are preferable to means most of the time; unit weighting is often to be preferred over seeing too much in the data and then using optimal (maximum likelihood, generalized least squares) data-fit weighting to distort it further.
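
To make the point concrete, here is a minimal sketch in Python with invented numbers; this illustrates the robustness argument, not Tukey’s actual procedures.

```python
import numpy as np

# A sketch, not Tukey's actual procedure: three junk entries drag the mean
# far from the truth, while the median barely moves.
rng = np.random.default_rng(0)
honest = rng.normal(loc=50, scale=5, size=1000)  # genuine responses near 50
junk = np.array([9999.0, -9999.0, 9999.0])       # deliberate garbage entries
data = np.concatenate([honest, junk])

print(f"mean:   {data.mean():.2f}")      # pulled roughly 10 points off target
print(f"median: {np.median(data):.2f}")  # still about 50
```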

Few remember that Tukey was also the King of Big Data. Tukey (with James Cooley) developed a technique called the Fast Fourier Transform, or FFT, that permitted fairly slow computing equipment to extract the key information from very complex signals and then compress that information into a smaller digital form that would retain much of the information but not the unnecessary detail. The ability to compress the data and then move it over a fairly primitive data transmission system (copper wires) made long-distance telephone communications feasible. And later, the same method made cellular communications possible.
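
As a toy illustration of the idea (an invented signal, not the Bell Labs telephony pipeline), keeping only the few strongest FFT coefficients preserves most of a signal while discarding the unnecessary detail:

```python
import numpy as np

# A toy sketch of FFT-based compression with an invented signal: keep the
# strongest frequency components, drop the rest, and reconstruct a close
# approximation of the original.
t = np.linspace(0, 1, 1024, endpoint=False)
signal = (np.sin(2 * np.pi * 5 * t)
          + 0.5 * np.sin(2 * np.pi * 40 * t)
          + 0.1 * np.random.default_rng(1).normal(size=t.size))

spectrum = np.fft.rfft(signal)
k = 8                                     # coefficients to keep
keep = np.argsort(np.abs(spectrum))[-k:]  # indices of the k strongest
compressed = np.zeros_like(spectrum)
compressed[keep] = spectrum[keep]

reconstructed = np.fft.irfft(compressed, n=signal.size)
rms_error = np.sqrt(np.mean((signal - reconstructed) ** 2))
print(f"kept {k} of {spectrum.size} coefficients; RMS error {rms_error:.3f}")
```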

Hhmm. More than 50 years ago, Tukey pioneered the view that the way to use “sloppy” big data was to distill the necessary information from it in an imprecise but robust way, rather than pretending the data were better because they were bigger and then using them to support over-fitted statistical models.

Hopefully it will not take another 50 years for the Big Data folks to recognize that trillions of data points may hide the truth and that the solution is to pass out some red and blue pencils and used envelopes. Tukey knew that 50 years ago.

All it “costs” to adopt Tukey’s methods is a little common sense.

Hhmm, maybe the Tukey approach is not so feasible. Big Data proponents at the current time seem to lack, in aggregate, the amount of common sense necessary to implement Tukey’s methods.

Turn off the computers in third grade, pass out the pencils, and let’s teach the next generation not to worship Big Data or to develop statistical models seemingly far more precise than the data behind them.

[Image: John W. Tukey]

Big Data (in service to the NSA) wants to be able to document what you do and when and where and with whom. All of the current databases that companies and public agencies maintain can now be tightly linked to get a pretty good profile of any individual.

But these models of what people will do when you ask them to buy a DVD of Thor 2 or a suit from Brooks Brothers are actually fairly dumb brute-force computer algorithms that break down when certain types of problematic data are fed into them.

Hhhhmmm. Some thoughts below in the mind map. Click the image twice for a full expansion.

[Mind map: Screw Up Big Data Brother]

Banks and online merchants use fairly sophisticated algorithms to identify probable cases of financial fraud and then protect themselves from the consequences of lost or stolen credit cards, etc. One of the most prevalent forms of elder abuse is financial. Aging adults are attacked by predators trying to get them to refinance their homes with reverse mortgages at exorbitant rates, to make huge gifts in return for “kindness” from strangers, and through one scheme after another. Sadly, much of the financial abuse is perpetrated by family members. And predatory financial scams are often targeted at aging immigrants to the US. Instead of just checking credit card records for fraud so as to protect themselves from liability, banks could use the same types of algorithms to scan withdrawals from savings and brokerage accounts, as well as charges to credit cards, to determine whether they are atypically large for someone in their 80s. At least in California, banks are mandated reporters (to law enforcement) of suspected financial abuse of elders. Wouldn’t it be nice if banks used the algorithms they already use to protect themselves (at the expense of your privacy) to at least protect older individuals (at a loss of the privacy they already gave up when they opened accounts) from the scum who try to separate cognitively impaired or depressed seniors from their lifetime savings? Wouldn’t that be nice …..
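
Here is a sketch of what such a screen could look like; the amounts and the 3.5 cutoff are hypothetical, and a real bank’s algorithm would be far more elaborate. In the Tukey spirit it uses the robust median/MAD rule, so the fraudulent withdrawals themselves cannot distort the baseline:

```python
import numpy as np

# A sketch of the kind of screen a bank could run on an elder's account:
# flag withdrawals far outside the customer's own history. All amounts
# and the 3.5 cutoff are hypothetical.
def flag_unusually_large(amounts, cutoff=3.5):
    amounts = np.asarray(amounts, dtype=float)
    med = np.median(amounts)
    mad = np.median(np.abs(amounts - med))
    if mad == 0:  # no variation at all; nothing to compare against
        return np.zeros(amounts.size, dtype=bool)
    modified_z = 0.6745 * (amounts - med) / mad
    return modified_z > cutoff  # flag only unusually LARGE amounts

withdrawals = [120, 80, 95, 200, 150, 110, 90, 25000]  # last: "gift" to a stranger
print(flag_unusually_large(withdrawals))  # only the $25,000 withdrawal is flagged
```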


[Image series: Big Data/Data Science 1–14]

A few thoughts about the importance of knowing the theories and prior studies in the content area before doing the modeling, the data collection, the data analysis, and the generation of conclusions.

You can’t model data without knowing what the data mean.

Click on mind map to expand.

[Mind map: Data Scientist]

We have had many data science fields in the past 50 years. The fields include applied statistics, biostatistics, psychometrics, quantitative psychology, econometrics, sociometrics, epidemiology, and many others. The new emphasis on data science ignores content knowledge about the data, their limitations, and the permissible conclusions.

We do not need to replace a round wheel with a square one.

See also previous post on Big Data/Data Science adopting the mistakes of Big Pharma.

a HubaMap™ by g j huba phd

Dec 13 2013: I have been experimenting with some formatting. This is the same map content as above, but using iMindMap 7, which was recently released.

[Mind map: Data Scientist sketch]

Can Big Data/Data Science avoid the train wreck of Big Pharma? I believe that the Big Data disaster will make the Big Pharma issues seem small in comparison.

But the issues will be about the same. A lot of the Big Pharma execs have become quite skilled at “beating the system” using “undocumented science” and many will move to Big Data and employ all of their very “best” moves and tricks. Big Data/Data Science has the potential to hurt the average individual even more than the greediness of Big Pharma.

[Mind maps: Big Pharma Train Wreck; Big Data Train Wreck]

Help!!!!!

HubaMap™ by g j huba phd

This afternoon I went to the local Panera and paid by credit card. My bank declined my charge of $4.82. I figured the magnetic strip on the card had failed, or that the new trainee using the cash register had made a mistake. She ran the card three more times and it was rejected each time. Then I got four text messages from the bank saying that they had rejected my charges. To text me, they used my phone number.

I called. They had put a hold on my card because they had some questions about my charges from the prior few days. The red-flag event was that I had made an earlier charge of $9.65 at Panera about eight hours before. Their computer program was not smart enough to figure out that it was not unreasonable for someone to have breakfast at 6:30am at a Panera in Durham and then walk into a Panera in Chapel Hill later in the day with 30 minutes to kill and have a coffee (and a Danish I probably should not have had) while I played with my iPad on their free wireless connection. The computer also questioned the $1 charge at a gas station this afternoon (which the human representative immediately recognized as the established practice of gas stations’ automated payment systems: open the charge line with $1 when you swipe your card, then put the real charge, say $92 for filling the tank, on the card the next day). I was also asked whether the payment made on the account was one I had made (I asked the customer service rep if she thought that, had someone paid a bill for me, I would tell her it was an erroneous transaction, and she laughed for a long time), as well as about a $71 charge to a software company outside the US.

They had freaked out because they could not reach me by phone at three numbers that were old and no longer active (I know they have my current number, because they sent the texts to it, and the same bank sometimes calls about my other accounts at the cell phone I never turn off, which has a voice mailbox). Of course, if their texts had not come from a no-reply address, I could have responded to the four texts they sent.

Predictive models have been around for a decade or more in banks as they attempt to identify fraud and protect themselves. The episodes I have with my bank about every 2-3 months illustrate what happens when somebody blindly runs predictive analytics over big datasets without using some common sense to guide the modeling process. Just because anyone can buy a $100,000 program from IBM or others for developing predictive analytics does not mean that the model that comes out of the Big Data and the expensive program makes any sense at all.

Or that the NSA or FBI or CIA or Google or Amazon models make much sense as they probe your private information.

If a computer predictive system is going to think that somebody is committing credit card fraud because they purchase two cups of coffee at the same national restaurant chain in a day, we are in big trouble.
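
As a caricature of the failure mode (my guess at the logic, not any bank’s actual model), here is the kind of brittle rule that produces exactly this false alarm: flag any repeat charge at the same chain on the same day, with no commonsense features such as amount, distance, or time of day.

```python
from datetime import datetime

# A caricature of a brittle fraud rule (a guess, not any bank's real model):
# flag any repeat charge at the same chain on the same day, ignoring
# commonsense features like amount, distance, or time of day.
charges = [
    ("Panera", "Durham NC",      datetime(2013, 11, 20, 6, 30), 9.65),
    ("Panera", "Chapel Hill NC", datetime(2013, 11, 20, 15, 0), 4.82),
]

def naive_flags(charges):
    seen = set()
    for merchant, city, when, amount in charges:
        key = (merchant, when.date())
        if key in seen:
            yield merchant, city, when, amount  # false alarm on a coffee
        seen.add(key)

print(list(naive_flags(charges)))  # flags the $4.82 afternoon coffee as "fraud"
```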

The bottom line is that Big Data models are going to have to be regulated before some idiot accidentally turns on Skynet.

Or maybe the problem is that the NSA or FBI or CIA or Google has done it already.

 

Click on mind map to expand.

[Mind map: Academia and Healthcare Big Data]

BIG Data is coming (or has already come) to healthcare. [It is supposed to usher in new eras of research, economic responsibility, quality of and access to healthcare, and better patient outcomes, but that is a subject for another post because it would be putting the cart before the horse to discuss it here.]

What is a data scientist? A new form of bug, a content expert who also knows data issues, an active researcher, someone trained in data analysis and statistics, someone who is acutely aware of relevant laws and ethical concerns in mining health data, a blind empiricist?

This is a tough one because it also touches on how many $$$$$ (€€€€€, ¥¥¥¥¥, £££££, ﷼﷼﷼﷼﷼, ₩₩₩₩₩, ₱₱₱₱₱) individuals and corporations can make off the carcass of a dying healthcare system.

Never one to back away from a big issue and in search of those who value good healthcare for all over the almighty $ € ¥ £ ₨ ﷼ ₩ ₱, here are some of my thoughts on this issue.

Click image to zoom.

[Mind map: Who Is a Health Data Scientist?]

Content knowledge by a well-trained, ethical individual who respects privacy concerns is Queen. Now and forever.

Keyword Board

topics and subtopics: who is a “health” data scientist? trained in healthcare? methodology research databases management information systems psychology? psychometrics other public health? epidemiology other medicine? nursing? social work? education? biostatistics? medical informatics? applied mathematics? engineering? theoretical mathematics? theoretical-academic statistics? information technology? computer science? other? conclusions must know content 70% methods 30% must honor ethics 100% laws practice privacy criminal civil federal state other greatest concerns correctness of results conclusions ethical standards meaningfulness validity reliability privacy utility expert in content field data analysis data systems ethics and privacy other member? association with ethics standards licensed? physician nurse psychologist social worker other regulated? federal hipaa state other insured? professional liability errors and omissions continuing education requirements? ethics renewal of licensure regulatory standards insurer commonsense laws go away if not well trained content field data analysis not statistics committed clean data meaningfulness subject privacy peer review openness ethics ethics ethics are arrogant narrow-minded purely commercial primarily motivated $$$$$ blind number cruncher atheoretical © 2013 g j huba

I wouldn’t go on a bus trip with a driver who is unlicensed. Would you?

Who is driving the Big Data bus? Data scientists? Mindless algorithms? Content experts and their teams of data scientist support staff? Marketing? Security firms (including those run by governments)? Terrorists?

I say this once, I will say this a million times … Content is Queen.

Algorithms that are primarily empirical without an understanding of the validity of the data being analyzed and the theoretical issues are dangerous.

An algorithm can predict — and I have no doubt several are doing so at this minute — how happy I will be on a global question (how happy are you?) or a behavioral index (at a sporting event, at the bank cashing a check, four days after the death of a parent) or the perceptions of others (just got tagged in somebody’s photo, got mentioned in a tweet, had a happy blog entry, had a birthday, just had a child born, got back a favorable medical test result, used a smiley face).

I have observed and analyzed and proposed new ways of measuring “happiness” and “anxiety” and “grieving” and “intelligence” for 40 years. I don’t really know what “happiness” or “anxiety” or “grieving” or “intelligence” is although I do know a lot about how experts have tried to define these constructs. I do know that a blind algorithm is not going to answer the question of what “happiness” is.

Do you want an algorithm driving the bus or someone who knows the limits of current data? I don’t want a blind algorithm predicting whether I am “happy” (and happy enough to buy something). I don’t want a blind algorithm predicting the economy. I don’t want a blind algorithm predicting how many healthcare visits I should receive under health insurance.

Content is Queen. The algorithms that drive the organization of Big Data need to be guided by content specialists (psychologists, sociologists, physicians, nurses, economists, physicists, chemists, bioelectrical engineers, etc.) not data scientists without expertise in one or more of the relevant content fields.

If the Queen rules, all will probably be well in the kingdom. If blind algorithms rule we probably will end up as batteries in The Matrix.

I vote (before it is too late) for the monarchy of content. I am not a battery.


Aaahhh… GiGo (garbage in/garbage out). The GiGo phenomenon haunts data analysts, statisticians, researchers, theorists, and anyone who loses their identity.

So these huge [health] datasets we keep hearing about … who controls them? what is their validity? reliability? utility? who else gets to see them?

And the data mining algorithms… proprietary or public? based on which tests and algorithms? who developed them? who validated them? are the methods valid? reliable? do they have utility?

And the results coming out of big data and proprietary data mining algorithms… reliable? valid? useful? clearly interpreted? limitations stated? misinterpreted?

Is big data and data mining about using world-wide data to find solutions to some of the world’s problems or to sell more books, videos, and cola?

I don’t think anyone really understands the big data sets and their limitations. I doubt that more than a small percentage of the data mining algorithms are valid. I sure as hell do not want somebody blindly using these algorithms on data they do not understand and then helping the government limit healthcare visits for high-need, low-resource individuals (sound familiar to anyone?).

An experienced statistician-data analyst-methodologist knows that when analyzing a large data set you must spend 98% of your time looking at (and fixing if possible) bad data points. The final 2% of your work is then much more likely to show something that is reliable, valid, and useful.
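
Here is a minimal sketch of that unglamorous 98%; the columns and cutoffs are hypothetical, since real screening rules come from knowing what the data mean.

```python
import pandas as pd

# A sketch of the unglamorous 98%: screen for impossible or suspicious
# values before any modeling. Columns and rules here are hypothetical.
df = pd.DataFrame({
    "age":    [34, 29, 41, 222, 38, -5, 36],  # 222 and -5 are miscodings
    "visits": [2, 1, 3, 2, 98, 2, None],      # 98 is a probable outlier
})

checks = pd.DataFrame({
    "age_impossible": ~df["age"].between(0, 120),
    "visits_missing": df["visits"].isna(),
    "visits_extreme": df["visits"] > df["visits"].quantile(0.99),
})

print(df[checks.any(axis=1)])  # rows to inspect, fix, or downweight by hand
```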

Big Data may save us, or it might kill us first. Or it might make us Borg or batteries.

No mo …. GiGo. [Is Nicki Minaj available to record this mantra?]


Big data this, big data that. Wow. In the end we will have better ways to sell underwear, automobiles, and “next day” pills (although in the latter case politics and religion might actually trump Amazon and Google). Blind empiricism. Every time you click a key on the Internet, it goes into some big database.

“Little data” — lovingly crafted to test theories, and collected and analyzed with great care by highly trained professionals — has built our theories of personality, social interactions, the cosmos, and the behavioral economics of buying or saving.

Big data drives marketing. Little data drives the future through generalizable theory.

Click on the figure below to zoom.

[Mind map: In Praise of Little Data]

Look around at the restaurant, on the subway, on airplanes, at bicycle riders (yup, I see it a lot around here), at store workers, at the person in the car next to you at the red light, in television shows: businesspeople, teens, tweens, older adults, hospital patients, hospital doctors, athletes, the disabled, those wearing the most trendy clothes and those dressed in all black with black hats and scarves. Data is streaming into all of their lives: email, texts, videos, music, e-magazines and e-newspapers, websites worldwide, Twitter, Facebook, Instagram, and the local restaurant’s menu. Netflix, iTunes, Amazon Prime, your bank, your doctor, your pharmacy, your local fast food purveyor, extra news and feeds from the sporting event you are attending, the latest Kardashian kamikazi komedy.

The video game is the work of the Devil.

With the exception of an increasingly small percentage of individuals who have unlimited data because they were early adopters and have not changed their cellular plans, most of us are paying by the gigabyte. Those with such plans are throttled so that they really cannot use an unlimited amount of data for a fixed price, so the fixed prices will go away soon.

Drop data prices and streaming will expand exponentially, the phone companies will make even more money, you will never see your friends in the flesh anymore, family dinners as we knew them in 1960 or even 1980 will be dead, replaced by family members sitting at the same table eating junk food and each watching their own data stream, and no one will want to go to the movie theater or Redbox anymore. Even the Colombian cocaine lords may go out of business.

Data overload will lead to data addiction and probably result in humanity evolving into the Borg Collective.


We need to make some changes before Skynet and the Terminators become inevitable.


I think the human race has no more than 30 years to evolve before the bytes take over. It will make the “War on Drugs” seem like the good old days and war with the Cylons inevitable. If you thought Big Pharma was going to control your life by promising the end of pain and disease, think again. Big Wireless will be even more insidious, and when historians look back in 100 years, the significant healthcare cost increases driven by Big Pharma will turn out to have been smaller than those driven by wireless. Wireless data streaming is already starting to become the crack of the next decades.

Turn the Devil’s toys off when you: go home, go to dinner, watch TV, are in a meeting, are in a class, are in a place of religious observance, go on vacation, go to bed, take a shower, go into the bathroom (yup, your screaming boss may be in a toilet stall at DFW or ORD), or go to a friend’s home. Get out of the habit of pulling your cell phone out to take a picture of your family and then checking your email or Twitter account while you are at it. And stop modeling the “cellular data comes before everything else” lifestyle for your kids.

Even Spock turned the data stream off sometimes. Do so and “Live Long and Prosper.”

 


Most people carry around a lot of assumptions about what other people should be able to do.

We typically assume that if you can write a blog post you can tie your shoes or feed yourself ice cream.

Or that if you cannot remember names or understand a simple conversation you cannot mind map. Or that if you can mind map you must obviously be able to make a decision about what clothes to pack for a two night trip.

Well … I can do a pretty complicated — and I think fairly creative — mind map in an hour or two that illustrates a pretty good conceptual understanding of scientific, psychological, or emotional material. It takes me two FULL days of high anxiety to pack a suitcase for a short trip, and I often arrive with clothing unsuited to the intent of the trip or the weather. I remember a lot of multivariate statistics and probably could still analyze a complicated BIG DATA set, but I have had times when I had to do Google searches to spell “arithmetic” correctly.

Doesn’t make any sense except to a skilled neurologist. And every person with dementia is different and every disease that results in dementia is different. And sometimes you can do things in the mornings that you cannot do later in the day.

Don’t let your perceptions and assumptions stereotype people with dementia. We can — depending upon the specific person — do a lot of things you believe we cannot do, because we find workarounds, like leaving a shirt buttoned and pulling it on over the head since buttons are too frustrating. And just because I can make a mind map does not mean I can button my shirt or make it clear to a server what I want for lunch.

Go figure.

I can however remember how to eat ice cream with a spoon. And I am pretty sure I will never lose that knowledge. But I am writing complete instructions for myself just in case I cannot figure it out in the future. Some things are too important to leave to chance.

Click on the image to expand it.

[Mind map: Do Not Assume]

I corrected a huge mistake in my thinking about mind maps during 2010.

I had started using the program Mindjet MindManager for mind maps at the time version 2 of the program was released. Over almost 20 years I occasionally used MindManager, alternating periods of a few days of intensive use with months of ignoring mind mapping.

I hardly considered organic mind mapping in the early days because: a) I cannot draw clearly or even print clearly, even though Tony #Buzan says everyone can; and b) I am a “tech guy or nerd” and, damn it, why would I hand-draw something if a computer program was available to turn my brilliant thoughts and words into pictures?

Secondarily, how could I possibly use wavy lines with labels in all kinds of orientations and colors best reserved for a child’s coloring book or a circus? I worked with groups of federal/state health policy makers, physicians, psychologists, social workers, nurses, counselors, grant funders, politicians, and public advocacy groups. Colors that looked like they came from a crayon box and drawings that looked like they were drawn by a second grader would be seen as childish, silly, not useful, and (most importantly) disrespectful by a group of senior professionals in the health/social care areas.

Idiot.

I bought every upgrade of MindManager over 20 years. Those upgrades were pretty expensive for a small consulting firm charging public sector fees less than half of those of private-sector companies.

I had strong misgivings about the MindManager mind maps I presented in meetings about HIV/AIDS services, research designs, elder abuse, optimally training geriatric nursing leaders, statistical analyses, and the many related topics I worked on during my career. Nonetheless I kept presenting the maps and using them in written reports.

I came to the conclusion that the method of mind mapping was primarily a way of presenting outlines in a somewhat novel way that introduced a lot of “white space” into diagrams typically plagued with too many words on a boring and ignored PowerPoint slide. Business executives liked the MindManager approach since it was in their comfort zone (outline in a picture).

I was becoming a Bleeping Idiot for continuing to use MindManager style Outline Mapping.

2010

I read about the iMindMap program in a variety of tweets from individuals I followed on Twitter and started trying the program and then reading much of the collected writings of Buzan; I watched some of the YouTube videos derived from his telecasts.

I thought organic mind mapping was kind of cool. It interested me at first because it would lead to presentations that were far more interesting than the ones with PowerPoint I suffered through 100 times a year (and gave myself to large audiences at least 50 times a year).

A couple of months later I decided that I would give an entire presentation (and the final report) using iMindMap 5 maps to a group at the US Health Resources and Services Administration, the major US government agency for financing public healthcare clinics and programs (and especially those targeted to HIV/AIDS services).

The project was to develop a framework for teaching the program managers of US-funded, locally administered African projects about increasing the number of nurses trained in, and providing, clinical services for treating HIV/AIDS. The topic was program evaluation theory and implementation. Program evaluation can be a very technical area, dominated by methodologists who speak in numbers and acronyms rather than concepts, and it is often perceived as excruciating by its participants.

The meeting was with two senior federal grant administrators and US-funded program managers and service providers, half from Columbia University (USA) and half from Africa, who were part of a six-nation African collaborative team.

I developed a dozen pretty large mind maps on evaluation goals and results, ways to conduct the evaluation and why, how to improve services using the results, respecting clients, and other issues including ethics and reporting results to the funders. The general topics were ones I had discussed with hundreds of groups in the prior 20 years.

All of the mind maps were developed in iMindMap using circus colors, curves, cartoony clip art provided in the program, font coding, and a nonlinear organization. I wanted to animate the presentation by jumping around the map “automatically.” This was before mind mapping programs in general (and iMindMap specifically) included presentation animations. At the suggestion of an expert on visual thinking, Roy Grubb (a Twitter buddy from Hong Kong — @roygrubb), I used the program Prezi to animate the jumps around the map into what could be a presenter-guided talk or a self-running kiosk video.

To say that the presentation was well received by the audience (program managers, senior policy makers, and medical professionals from the USA and various African nations) would be a gross understatement. The presentation was praised, a couple of physicians said this was the first time they really understood what evaluation was, and, perhaps more concretely, the participants insisted on having the one-hour presentation evolve into a two-hour, highly interactive and animated group problem-solving session that pissed off the US State Department because the participants arrived at their meeting at State an hour late. The evaluation for the next five years of an extremely large funding program on HIV/AIDS treatment capacity in Africa was altered. A subsequent program evaluation project for the African project was awarded to our company.

I was just presenting the same-old/same-old conclusions I had evolved over two decades. But reformatting the information into a #Buzan-style mind map using the iMindMap program forced me to re-think the overall system of evaluation I believed in so as to prepare a liberating and valuable experience for the audience. The new mind maps were nonlinear THEORETICAL MODELS accessible to individuals with training in neither program evaluation nor mind mapping.

By contrast, the old way I would have presented the same information, in MindManager or as bullets in PowerPoint, was as nothing more than a formatted outline (or what I now call an Outline Map), and my thinking and that of the participants would not have gone in such creative directions.

I was pleased to find out that one of the meeting participants had been trained in a workshop by Mr. Buzan and that she felt that the presentation mind maps were the most Buzan-like she had seen since the training.

The hundreds of mind maps I have made for this blog have reinforced the conclusion I reached from that HRSA meeting on HIV/AIDS that computer-assisted, Buzan-style organic mind maps and visual thinking methods are far superior to the “traditional” linear methods that are forced by some computer programs that do not encourage Buzan-style thinking and mapping.

Bright colors, contrasting fonts, curvy lines, cartoon graphics, one word per branch, nonlinear organization …

I joined the Circus.

[Mind maps: Big Data Train Wreck; DSM5 Tournament; Huba’s Laws of Mind Mapping]

No, I haven’t lost “it” and this is not a science fiction movie.

With the unleashing of big data, big computing, big temptations, and big greed, it is going to be really tempting to develop a George Huba (or Bill Smith or Mary Doe or, heaven forbid, a George Bush) computer model that can fairly accurately predict from my lifetime experiences whether I will buy a new car next year (and what type and in which cost range and maybe from which car dealer), purchase or sell a home, shift from Converse to adidas sneakers, become emotionally distressed if I do not have chocolate, and purchase Apple stock. Or run a simulation of me as the CEO of a particular company to determine whether I get the job. Or look at my medical history and determine whether it is likely that my grandchildren will have each of 20 expensive diseases that no insurer wants to touch.

Already the IRS runs programs to estimate the likelihood I cheated on my income taxes, Amazon runs programs to estimate the likelihood I will purchase certain books and socks before or after the December holidays, and my credit card company runs models to determine whether it is likely or not that I purchased shoes while on a business trip (yup, they once froze my credit card because their computer model said I only buy sneakers).

OK, so the accuracy of the big data scientists is only something like 20-50% now. What do you think it will be when your book purchasing history is integrated with your job history, income, ice cream purchases, pharmaceutical purchases, and BMI? And then fine-tuned with the grade you got in college English, chemistry, or psychology; whether you had a hiking or a beach vacation; whether you purchased (and used) sunscreen and had a history of purchasing sun hats; and the diseases that all four of your grandparents and your parents had at different times in their lifetimes. And whether your car is more than 3 years old. And what do you think it will be when we create a generation of data scientists willing to capitalize on huge data to build such models for salaries that will approach those of professional athletes and rock stars?

Ten years from now, the computer models produced of selected individuals will make Mark Zuckerberg, the Google guys, and Jeff Bezos look like rank amateurs in profiling.

[Oh, and by the way while writing this post Google knows that I looked up Mark Zuckerberg’s name and the spelling of adidas.]

I want to tell anyone that wants to develop a mathematical, computer model of me (or my behavior, beliefs, attitudes, skills, history, and future intentions) to cease and desist. Or fuck off.

Which raises the questions … Do I own the copyright (patent, trademark) to my own life? [And if I do, what are the limits and will violations of those laws by a number of countries be ignored?]

This is not so far-fetched. I spent my whole life becoming the person I am. Does anybody have the right to take all of the big data about me and distill my life down to formulae and algorithms that will explain my past and current behavior and predict what I will do in the future? Should people be allowed to model individuals, I fear that the suicide rate will go up dramatically as people find out how much these models can be used to control them.

As a psychologist, I spent my career studying people so that we might better understand their fears and concerns, help them better use their full potential, become happier, control their own aggressive or violent tendencies, and generally become the people THEY WANTED to be. Neither I nor any other ethical psychologist set out with the intent to model the behaviors of others so well that the resulting models could be sold to governments and corporations.

Big and huge data, data scientists, companies, and governments need to be prohibited from violating the rights of individuals to “own” their individual lives. If we ever let others “own” our individual identities, we will have crossed into new territory from which there is no return. The technology is almost there to create such individual mathematical models.

I was endowed by my creator to own the copyright, patents, and trademarks of my own life… and to answer for what I chose to do with that intellectual property (free will). I choose not to sell my soul to the devil.

A few more thoughts are in the mind map below.

Click on the diagram to zoom.

[Mind map: Copyright/Patent My Life]

The majority of my blog posts include mind maps or other graphics. Please look at the blog post archive for hundreds of examples. Also note that since this is really just a “sampler” gallery, the blog posts may contain more current versions of these maps, and in most cases there is an explanation of both the content and the mapping methods in the individual posts.

This page is intended to be a sampler of mind maps in various blog posts. This is not an inclusive list and there are far more mind maps in the blog posts than just the ones included on this page.

 


I am in the process of updating this page so that each image is linked to the blog post from which it was copied. The images are being linked in order from top to bottom of the page, but the process is incomplete at this time.

Click on images to zoom.

[Image gallery: a sampler of dozens of mind maps from blog posts on dementia care, mind mapping methods, program evaluation, Mac apps, human rights, and big data.]

 

This is the personal blog of George J Huba PhD. I was trained as a research psychologist, have 35 years of experience in research and program evaluation of healthcare models, and was diagnosed with a neurodegenerative disease in 2010. Since my medical early retirement in 2011, I have focused my personal research on evaluating and developing inexpensive visual thinking methods (such as mind mapping/modeling) for those with cognitive decline, dementia, typical aging, or for adults who wish to minimize future cognitive decline. Having professionally worked with several thousand health- and social-care professionals over 35 years, my work is informed by the dozens of disciplines working on neuroscience research, patient care, aging, caregiving, and healthcare systems development.


I doubt that there are many people expert in mind mapping who would disagree with me that iMindMap is the most feature-laden of the more than 100 programs for mind mapping to be found all over the Internet.

Once a year — as promised when the program was first introduced — iMindMap has a new release that provides many new features and usability enhancements. And unlike others, they produce a great upgrade every year on time. And free from most bugs that live in Cupertino and Redmond.

How good is iMindMap 10?

Click on the mind map (actually a mind model, in my terminology) below to expand its size. For those of you with no patience for, or dramatic sense of, the big build-up, you can skip directly to the “9” branch. iMindMap is the 8,000-pound gorilla.

As a note, my review was conducted about six weeks after receiving the program and using it exclusively rather than earlier editions. I use a Mac only, and my review was conducted on a 2013 MacBook Pro. I have worked with the program both on the internal 15″ Retina MacBook screen and on a 27″ external monitor. [I actually like using the MacBook screen better.]

[Mind map: iMindMap 10 review]

Chris Griffiths and his team at OpenGenius have taken the work of Tony Buzan and, in the process of developing a program, expanded and formalized that conception in a creative way that is brilliant in its overall utility and ease of use. iMindMap 10 is my favorite mind mapping program but, most importantly, my favorite and most useful thinking tool. For those of you who do not follow my blog in general, I live with Frontotemporal Dementia, and iMindMap has served as a “brain assistance tool” for me since 2010, in daily living and in continuing my professional interests in a creative way. I can accurately say that the various versions of this program “changed my life.”

This is a tool formulated by expensive consultants who want to help corporations make more money while at the same time profiting from that help. But the tool has come to greatly exceed the original vision: it is intuitive to use, and most adults and all children can learn to use the program for free using Internet trainings. Don’t be scared off by all of the publicity about a $3500 training and a certificate signed by a consulting firm (not an accredited educational institution). You do not need a course to learn this program, and it is not clear to me that expensive courses help you learn to apply it in the real world. If you are willing to invest a few hours you can be doing adequate mind maps; if you invest 10-20 hours you can be doing accomplished mind maps.

Get over the hype and realize that you CAN learn this program quickly on your own and even more rapidly if you study examples available without cost at many blogs including this one (Hubaisms.com), a depository of many thousands of mind maps at Biggerplate.com, and many other sites including youtube.com where many training sessions are presented.

While there are four “views” in this program, the primary mind mapping module is the reason for using this program. The other three views are largely alternate ways of looking at the same information and data. While they may be “quicker” ways to collect information together from a lecture or library research, at the end they feed their data into the mind mapping module where the actual thinking work, theory building, model development, and communication is done.

I have a few criticisms of the program, but these criticisms do NOT change my overall rating of the program as A+.

  1. The time map module is really just a Gantt chart, of interest to but a few mid-level corporate managers and high-level executives who have not yet adopted better ways of team management. As a Gantt chart the module is fine, albeit about the same as most existing software in that area. Unless you are like a friend of mine who manages 10-year projects to send landers to Mars with 10,000 team members, I cannot imagine why you would want to use a Gantt chart.
  2. In my view and that of many other potential users, a “time map” is actually a timeline that incorporates mind map features. While others have tackled this issue (most notably Philippe Packu and Hans Buskes), my formulation was the original. The resulting blog post (click here for a new window) has been the most read one about mind mapping methods on my blog site for FOUR years. I’d urge the iMindMap developers to look at my model of time maps which requires a lot of custom work that I am sure they could easily automate.
  3. For almost all mind map users, the future is using pre-made templates designed by content experts. Purchase a template package and you can then create your own mind maps by adding your information to the pre-designed expert map for your area, whether it be healthcare or project management or writing a term paper or designing a research project or selecting the right clothes for a 5-day business trip. At this time iMindMap does not yet have a way of protecting the intellectual property of template developers, which provides little incentive for developing templates as a business and therefore stunts the growth of the mind mapping community.
  4. For this program and all of its competitors, the icon and image libraries are never big enough. On the other hand, you can purchase separate icon and image sets from third-party packagers on the Internet if you have special image needs. iMindMap allows you to use such external pictorial elements extremely easily. My favorite new feature is that you can add icons to their library and size the icons in a custom way. iMindMap’s included images should more fully capture the fact that users of mind maps and their audiences are much more diverse in terms of ethnicity, race, gender, gender-orientation, education, and age than the included image libraries. And hey OpenGenius folks, how about some icons for numbers in colors besides orange and lime so that the color schemes of my mind maps are not destroyed if I number ideas.
  5. More free online trainings would be desirable, and most importantly trainings that do not run at the speed of a bullet train. Two-minute presentations that cover 20 minutes of material are somewhat counter-productive. The current videos run too fast for new users and at times even for the most experienced users.
  6. My experience — admittedly infrequent — is that Technical Support is fairly “rigid” in that there are lots of forms to fill out before you get a real chat session going and too many requests to send them esoteric files on your computer. All in all, as technical support goes, while everybody is trying quite hard to be helpful, they ask you to conform more to what is convenient for them than what a confused user can deal with. When I want help or to make a suggestion or make a request for a new feature or default, I want to just compose a short email so OpenGenius can get the right person there in contact with me. I most definitely do not want to complete an overly complicated form. Too much technocracy in that process.
  7. Besides the books of Buzan which are not all that useful for learning the program or how to do real visual thinking in real world applications other than rudimentary management, OpenGenius needs to develop some easier access, very practical books that act as “manuals” and present information in more comprehensive ways than is done now. Old fashioned manuals that are (or can be) printed have a lot of appeal to many.

In summary, this is an amazing program that is much more than a program for mind mapping. It is unsurpassed among mind mapping programs. Additionally, it is what I call a “visual thinking environment” or VITHEN. My “criticisms” are minor and do not in any way diminish my overall evaluation of the quality of the program.

My blog at Hubaisms.com on which you are reading this review was designed and “written” largely in “iMindMap.” Most of the mind maps I use to guide my own “complicated” life were developed in iMindMap.

Exemplary job folks at OpenGenius. Version 10 is an additional large step in the evolution of the program and mind modeling.

It is not illegal in the United States to ask job candidates to take physical and psychological examinations before being hired for a job that has huge physical and psychological demands. For instance, such public employees as police officers, firefighters, military personnel, and others take appropriate physical and psychological tests both before and during employment. They also take tests to detect illegal drug use.

As Americans like to say, POTUS is the most powerful job in the world. There is no question that the job has huge physical demands: prolonged periods of 12-16 hour days under high-stress conditions. Many problems can be made much worse by such a lifestyle. The president also makes key decisions, often under high duress and without full data, that can affect the lives and welfare of thousands if not millions of individuals throughout the world. We all know about the use or not of the nuclear codes, but remember that the President may make decisions daily or weekly that affect the safety and well-being of US and other world citizens in profound ways, such as food distribution, medical aid and research, international trade agreements, and regulations on the US stock market and financial institutions.

Most Americans agree that we want healthy law enforcement officers who can assist in situations requiring physical fitness without harm to themselves. We also agree that we do not wish to have psychologically distressed individuals without mental stability intervening in situations of aggression, ambiguity, potential harm to bystanders, or mistaking innocent individuals for those who have committed a crime. The same is true for all other first responders, military personnel, nuclear plant operators, airline pilots, and many more.

Why are we willing to let someone be hired for the job of President of the United States (through the process of majority vote) without complete physical, psychological, and neuropsychological examinations conducted by a team of physicians, psychologists, and other appropriate healthcare professionals? A team of 3-9 individuals could be appointed through some type of consensus process among professional associations and political parties. Even better, we could make use of some of the thousands of highly qualified and brilliant healthcare professionals who are officers of the US Military and already sworn to protect the Constitution and laws of the United States without regard to partisan issues.

If there was ever a time to implement this, it is before the November presidential election this year.

I do not want an individual as the “most powerful person in the world” who is physically and/or mentally unfit for the job unknowingly hired by the electorate. I am especially concerned that candidates above the age of 50 could have untreatable neurodegenerative diseases (such as I have) that affect decision making processes, especially under stress.

If POTUS is the most powerful job in the world, it should have the most stringent job requirements including physical, mental, and neuropsychological health making it possible to adequately perform the high demands of the job. Cutting through all of the politics, I believe that the physicians, psychologists, nurses, and other healthcare professionals of the US Military are capable of making competent, non-partisan judgments about fitness requirements for ensuring that the Constitution of the US is protected and followed.

Given the public statements being made by both presumptive presidential candidates this year, it is time to ensure that the contentious statements made by both are not the product of physical, mental, or neuropsychological illness, but rather are being made by motivated (if angry) and expressive (if overly so) individuals using typical standards of normal and healthy logic and decision making. The electorate deserves to be informed about the results of such fitness exams before making the decision in November whether to hire one or none of these candidates.


If you are a dementia caregiver for a family member or a professional caregiver, I bet I just got your attention. Yes, I really do want you to think about the process of providing care as a scientist would. Observe. Make up some hypotheses. Collect data over some period of time. Analyze your data by looking at your observations and seeing if they confirm your hypotheses.

When you go to the doctor with your person living with dementia (PWD), show the doctor some of your “data” and present what you have concluded. See if the doctor agrees with you. And in case you wonder, I believe most neurologists and psychiatrists and primary care providers would be delighted to have verbal reports every 3-6 months at follow-ups.

Don’t let the words science, data, experiment, analyze, hypotheses, and confirmation scare you.

In practice this is actually pretty simple. Carry one of the ubiquitous little 3×5.5 inch notebooks around with you (Field Notes or Moleskine or Office Depot), as well as a pen or pencil. Every once in a while, make a short, one-sentence note of what the PWD has been doing, as well as emotional reactions, interest level, agitation, annoyance, laughing, and other outcomes. Note if the activity was one in which you had to participate and use a lot of energy, or if it was an activity that was done semi-autonomously.

You should write down anywhere from a dozen to 50 of these notes in a day. SHORT notes. Write them down when you are not with the person under care; don’t make a big thing out of it, but keep the small notebook in a pocket. This is not your diary or a diary of the PWD. Rather it is a simple set of observations about what was done when, how everyone involved reacted, how the PWD felt during and after it, and how the caregiver felt. Note which, if any, of the participants (PWD, caregiver, others) felt great distress/agitation or great interest and happiness.

Every day you should jump ahead a few blank pages in your notebook so you cannot see what you have written already. Don’t look back the first time until at least two weeks have passed.

Every week or two (but not more often) you should read back over the notes from the past days or weeks and see if there are some predictable things that happen if you leave your Mom or Dad alone to watch TV, or if others are also in the room, or if they did not have breakfast at the usual time, or under any combination of these factors. Do you see patterns of people and activities that almost always make the PWD calm and focused, and other ones that almost always result in agitation and anger?

As you get into the swing of the research project, every time you go back and review your SHORT notes, you will get a better feel for what does or does not make the situation optimal for the person with dementia as well as for family members and the primary caregiver.

Do you have to take notes? Probably. If you write something short down, you will more accurately remember it and dozens of events that happen throughout the day or week will not get all “mushed together” in your memory.

Oh, and by the way, these notes should really be fairly private. After you observe that your Dad seems to be very happy when a baseball game is on and very agitated when others interrupt the peaceful time, you can go back and ask him if this is so. But you cannot pull out your notes and say that 72% of the time ….. And you cannot use the notes in a punitive way. This latter point is CRITICAL: if you are going to use the data punitively against the PWD or another family member, burn all of your notebooks and stop collecting data. And apologize and be VERY VERY contrite.

To run a great experiment of maximum usefulness to the person with dementia and the caregiver, you need to look at your notes and be objective. Your goal is to find even small things in your notes that can make life better for everyone at least some of the time. And to realize that other things just seem to happen randomly so you should not beat yourself up if your carefully planned outings to the cinema just don’t work because your Mom gets very agitated from the noise level, number of people, and high stimulation from the big screen.

Oh one last thing. If you are a PWD reading this, there is no reason you cannot keep your own research notes and try to find patterns of activities that can help your caregiver live her or his own life better by causing minimal stress to them at the same time you make your own life more meaningful. One of your huge jobs is to support your caregiver, make her or his life less difficult, and express your appreciation.

Let me be very clear. The suggestions in this post are NOT suggestions for treatment, nor will the note taking and research process make any disease process better. Notes do not substitute for medical treatment or professional counseling, and the kind of note taking described here is not part of any type of therapy. But the process of writing down important things and going back to see whether there are common causes of your moods and social interactions may be very useful.

Here is a mind map with some suggestions for your research project. I hope it works as well for you as it has for me, but there is absolutely no guarantee of that at all. And if the process of the “research project” causes any anxiety or other negative feelings among PWDs or caregivers, it should be stopped immediately.

Click the image to expand it.

[Mind map: The Great Dementia Research Study]

 


[Ok, for all of you researcher types who want to nit-pick, yes I am fully aware that this is technically not an experiment or research study but rather an exploratory program evaluation of an emergent model of excellence. But they don’t teach that in 10th grade so I took a few liberties since everyone remembers their high school labs with hypotheses, theory, observations, analysis, and conclusions.]

My generation is the first that could potentially have been using computers for much of its adult life.

When I was 20, I learned the computer language FORTRAN, a very early scientific-computation language. When I was 21, I used the big mainframe computer and a printer with green-and-white bar paper to print my grad school application essay. Every school I applied to said that they had never seen anything like it. I got into a bunch of good ones.

When I was 22, I learned APL (the best computer language ever, which very few people ever learned) and the original vi text editor from the original versions of UNIX. vi was the first way you could do primitive word processing. Text editors like vi were around for about a decade before usable word processors.

When I was 26, I joined a lab that had the original IBM word processor, which cost about $50,000 in 1977 dollars, supported 8 PhDs, and had the processing power of a basic 2016 iPhone (or less). As I recall, that machine had about 16K (yup, K, not MB or GB) of memory, necessitating that it read from and write to the progenitor of the modern (1980) floppy disk. We loved that machine, which also had a primitive “smooth” printer (probably an inkjet, perhaps a primitive laserjet).

By the time I was 33, I was the Director of a group of programmers and psychologists in industry, designing and writing software for educators, psychologists, managers, and healthcare to run on the original (floppy-disk operating system) IBM PC and the first widely distributed Apple IIe computers. When I started using PCs, there was only IBM PC DOS; later, through a well-written contract by Bill Gates' dad, Microsoft was able to relabel the product MS-DOS (Microsoft DOS), enabling it to sell the product to Compaq and other PC makers and eventually drive IBM out of the PC business over the next two decades.

By the time I was 35, I had founded a company and gotten an early-generation PC and a first-generation laserjet. Later I had the first Compaq notebook computer (the size of an 8.5 x 11 sheaf of paper), weighing about 10-12 lbs, with a nifty blue-on-lighter-blue screen.

Computers developed over the next 25 years and became cheaper; word processors and computer companies came and went; and by the time I retired (for medical reasons) at the age of 60, many adults my age had started to use PCs or Macs (most in the late 1980s or 1990s) and had a home laser printer. Word processors were easy to use, pictures could be displayed, you could buy books and music and food and lawnmowers and computers and printers online, and most of the accumulated knowledge of the world was on your desk.

When people start to decline cognitively as part of typical aging, disease, or injury, a high percentage already know how to order a pizza on their computer after they can no longer drive, and to download the most recent movies when they no longer wish to go to movie theaters. Some can even manage to access their online medical records using the arcane and stupid database systems mandated for all healthcare providers. Even I (with all of my computer experience) am often frustrated with the online Medical Information System used by the University of North Carolina medical system.

In 2016, although many wonderful things are possible, the state of computing and its integration into services for those undergoing cognitive decline is still spotty and misunderstood by case managers, healthcare professionals, and caregivers, and patients are not supported with the technical issues that arise.

I was born at about the exact perfect time to use new computer hardware and software as they were developed, and I was educated at schools and worked in settings on the cutting edge of computer technology, so I would argue that my computer skills are among the broadest of my generation.

Still, many issues in computing and software are becoming more difficult to understand as the technology grows more sophisticated and I watch my brain cells die. The biggest issue, of course, is that many seniors do not have access to current computer hardware and software. That is sad, because such access could improve the quality of their lives, make them at least a little more independent, remove some burden from unpaid caregivers, and cut costs in the healthcare system that far exceed the cost of distributing computers to the financially challenged elderly.

The situation can be characterized as “The GOOD, The BAD, and The UGLY.”

The following mind model (or advanced mind map) explains the issues. Please click the image to expand it.

Cognitive Decline and The Computer Generation 2016



Click here for an index of all blog posts on Huba’s Integrated Theory of Mind Mapping.

HITMM 2016

Tony Buzan’s most controversial rule for mind mapping is to use one word per branch. People ignore it, hate it, complain about it, call such maps words I do not care to repeat here, love it, create with it, blame it, and look at other alternatives.

I find Buzan's branches with short one-word labels work pretty well for generating pretty pictures, and extremely well in mind maps used to facilitate brainstorming. You can follow this rather blind rule and obtain mind maps that work fairly well. But not as well as they could, especially if you factor in the observation of many people that branches labelled in Buzan's rigid format are far more confusing to read.

In the beginning of my theoretical work a couple of years ago, I thought that Buzan's rule was the best one for labelling branches and channeling the thought process. Over the ensuing years I have come to realize that Buzan's rule and the resulting maps are too restrictive, promote verbal rather than visual thinking, and become “stringy,” especially with curved branches, which most people will not understand as well as a mind map labelled with concepts (constructs, summary ideas), many of which require several words to disambiguate.

In the general case, the rule of one word per branch (OWPB) does not work very well in most applied knowledge applications. Medical diagnoses are named with more than one word explaining how they are categorized or caused; Oscar-nominated films are named with labels indicating their content, setting, and historical period; and complex naming rules apply to great baseball shortstops, serial killers, books, stressors, rewards, people on the street, and great vacation resorts.

If you use the Buzan rules, you are basically focusing on words as you try to find places to put single words that collectively describe some complicated idea. Buzan's rule reinforces word dominance rather than picture-visual dominance! If you put one concept on each branch (with several words needed to describe many concepts), you are focusing on the underlying concept (a visual datum) and not on a specific word.

Huba's rule of one concept per branch supports true visual thinking about concepts that can be pictured. It promotes integration, understanding, and theory. Diagrams are better labelled with a full concept than with several successive branches of individual words, as Buzan would have us do.

Huba's OCPB rule promotes full visual thinking; Buzan's OWPB rule promotes an encyclopedic knowledge of individual words at the loss of the visually complex object, fracturing basic concepts into a form that does not portray the full richness of ideas and their visual nature.

Here are some more thoughts in the form of a mind map. Click on the image to expand it.

1concept


Why does Buzan's theory fall apart at the idea of one word per branch rather than the more correct and useful representation of one concept per branch? I believe it is because Buzan's original rules of mind mapping from the 1970s and 1980s are based on a digital model of the brain's data processing (a set of “on/off” switches, like those little pixels in your computer monitor that turn off and on to represent a picture in full color) that was commonly misused from the 1950s through the 1980s, and still is today.

The brain and its workings are analog. Lots of information “clumps” together rather than sitting in a bunch of on/off switches in various locations. Analog devices use “degrees of on” to convey information, in contrast to a discrete on/off, yes/no digital device. Also, information is blended from many different sources and brain locations to construct a concept or idea or map within the brain.

More precisely, the brain is a stochastic device that mixes multiple neuron firings (primarily digital inputs or sources of information) into a more analog, continuous form, in part by accounting for a random component of erroneous information added under a number of conditions (including the effects of brain disease). Or you can call it an analog device that makes probabilistic predictions. Or you can just say that it is much more complicated than the assumptions underlying Buzan's one-word-per-branch rule.
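To make the digital-versus-analog contrast concrete, here is a toy sketch of my own (not Buzan's model, and not real neuroscience): a single “digital” unit is all-or-none, while averaging many noisy all-or-none firings produces a graded, analog-like “degree of on.” The function names, threshold, and noise level are all invented for the illustration.

```python
import random

def digital_unit(stimulus: float, threshold: float = 0.5) -> int:
    """All-or-none response: the unit fires (1) or does not (0)."""
    return 1 if stimulus >= threshold else 0

def analog_readout(stimulus: float, n_neurons: int = 1000, noise: float = 0.2) -> float:
    """Blend many noisy all-or-none firings into a continuous value."""
    fires = sum(
        digital_unit(stimulus + random.gauss(0.0, noise))  # random "error" component
        for _ in range(n_neurons)
    )
    return fires / n_neurons  # fraction firing approximates a "degree of on"

if __name__ == "__main__":
    for s in (0.2, 0.4, 0.5, 0.6, 0.8):
        print(f"stimulus={s:.1f}  digital={digital_unit(s)}  analog={analog_readout(s):.2f}")
```

Run it and the digital column jumps from 0 to 1 at the threshold, while the analog column rises smoothly; the random noise term plays the role of the “erroneous information” mentioned above.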

To maximize the usefulness of mind mapping and to promote its greater use for such important purposes as dealing with dementia and other medical conditions, personal and professional planning, decision making, communication of visual ideas big and small, learning, theorizing, remembering, and many more, we need to maximize the usefulness of the visual thinking model underlying mind mapping and move to the concept-based (construct-based, image-based) system of constructing mind maps.

The decision about how to label the ideas in a mind map (with single words, as Buzan requires, or with single ideas I call constructs or concepts) is the most important one made in mind mapping.

Click here for an index of all HITMM 2016 blog posts.

HITMM 2016

[Click on all images to expand them.]

[My comments below pertain to left-to-right written languages. For right-to-left languages, can you simply assume that my words translate into right-to-left terms? I suspect so, but only native speakers, readers, and writers of right-to-left languages will be able to answer that. I encourage their thoughts and comments.]

Buzan's rules or guidelines (or “laws”) of mind mapping require a central element from which other ideas flow hierarchically, with the most important parts of ideas represented near the center. That is, there is a central idea, often represented by a picture (Buzan says this will always be the case, but the majority of central elements suggested by his iMindMap program are outline images into which words are written, so apparently this is not a rigid requirement), and a hierarchy of sub-branches emerging from branches emerging from the central idea.

This rule has two major problems.

First, the format often makes for overly compressed branches and sub-branches and a lack of “white space.”

Second, and most important, the left side of the diagram (to the left of the central image, where one effectively has to write and read right-to-left, the opposite of the way one normally writes and reads) is quite difficult for many people to write or read when the map is complete. Many try to read the left side from left to right and end up with ideas that look like Yoda wrote them down. “This isn't good,” although some would think it cool to say “Good isn't this” with the implied acumen of Yoda.

Here is a typical radial mind map with the major idea in the center and secondary and tertiary ideas radiating out from it. The example is from a recent blog post on some advantages that mind mapping might offer persons with dementia (PWDs) or those with cognitive impairment. I wrote it in the traditional Buzan radiant style. The prior post gives a rationale for, and explanations of, the mind map.

Radial (Circular)

What happens when the radial mind map is oriented left-to-right? Here is a first sample of the re-orientation. Is this the way you normally think when you read (left-to-right)? Most importantly, when you take into account that physicians and other healthcare providers are used to working in a left-to-right world, the left-to-right structure is more compelling when one READS a mind map.

Left

Here is a second variation on the left-to-right concept. Some may find this easier to read unambiguously.

Top

Buzan argues that all mind maps should have curved branches because those are more “interesting” to the easily bored brain. I don't agree with Buzan on this matter, because a linear format with straight branches seems more UNDERSTANDABLE, especially to those who primarily read mind maps others have created. Here is the same mind map in a left-to-right format with straight branches.

For the purposes of reading, or of filling out a standardized template, the left-to-right linear map may be clearer to the cognitively challenged or to those who handle large amounts of conceptual data daily and cannot afford to make errors (healthcare providers).

Left Linear

How do I reconcile the differences and strengths among these four formats?

  1. I believe that it is easier to WRITE or BRAINSTORM or CREATE in the radiant format. The radiant design has the advantage of clearly indicating the most important parts of the idea or information. Important information appears in the center and branches and sub-branches gradually emerge.
  2. But for reading, for passing information from one person to another, or for filling in a pre-designed form, the left-to-right linear format may be the best, or at least the easiest, format for people to quickly and accurately transmit information. The linear left-to-right format is a natural for healthcare, where information passes hundreds of times through individual hands and through scanned documents that may also be computer-interpreted or reformatted for databases. And the left-to-right format with linear branches is probably the easiest to understand for a person with cognitive impairment or with no familiarity with the radial format of rigidly Buzan-style maps.

If you read this blog regularly, you will know that I have thousands of mind maps lying around that were created in the traditional radial format. How long does it take me to convert a radial mind map into a left-to-right oriented one? TEN MINUTES in the program iMindMap, created by Chris Griffiths in tandem with Tony Buzan. Conversion is a semi-automated process that requires some judgment about the final arrangement of the branches. So if the person who WRITES or CREATES the radial mind map then converts it to a left-to-right format to COMMUNICATE with patients and doctors and nurses and more doctors and then the patient again, that small extra step puts the needs of mind map USERS within easy reach.

To summarize, I find it easiest to create (write) new content in the radial format but strongly suspect that most users will find it easier to read that content in one of the left-to-right formats.

But remember that I am working in the field of healthcare. And I believe that mind mapping can help me live better with the several medical conditions I have.

My solution will be to present two alternately formatted mind maps on this blog and in explanatory articles and manuals. That is, for many maps I will include both a radial mind map for further brainstorming, editing, and rewriting, and a left-to-right linear or almost-linear map for readers and others who find the traditional reading orientation best.

Some readers will find the radial format most valuable. Some will find the left-to-right format more useful. In general, the choice of radial versus left-to-right rests on the content of the map, the intended audience, the overall system in which the information is being used, and an understanding of the typical cognitive functioning and training of the intended audience. And it does not hurt to present the information in both formats, so that everyone is covered and everyone becomes familiar with both formats.

Does current neuroscience prefer one of these formats over the other? I do not find any compelling research (when I find any research at all) showing that radial diagrams are superior to left-to-right ones. Such evidence did not exist in the 1970s and it does not seem to exist now, although research will continue and we will need to adjust our conclusions as more “definitive” findings are produced with better equipment, better research designs, and better data.


Since the 1960s, the US Congress has borrowed money from the Medicare and Social Security Trusts — money deducted from the salaries of all workers — and used it for defense, foreign aid, defense, social services, defense, medical services, wars, defense, bombs, and Congressional pork.

Now, Trusts funded by workers' salary deductions to pay for retirement living and healthcare expenses are supposedly going to go bankrupt because dementia is reportedly going to destroy these programs.

Truly one of the great cons in history. For decades Congress has committed fraud by stealing from the pension fund Trusts to pay for goodies for special interests that have resulted in their own re-elections.

Now, instead of getting mad at Congress and throwing the big violators in prison, we (or my children) are supposed to get angry at our parents for having dementia.

The US has prosecuted organized crime for decades for stealing from pension plans.

Somehow we forgot to prosecute the biggest violator.

Stop blaming dementia for the destruction of Social Security and Medicare. Responsible elected officials would not have stolen from these trusts, nor enacted tax cuts while taking money needed for the social services of the future.

There is a way to fix this problem available to you in November. I hope you use it.

I hate clutter. I've always had far too much stuff around, usually shoved into an unused closet, the garage, or storage: old questionnaires, old clothes, old pens (most run dry by now), old external data drives (in case I need an email from 1990), old office supplies, old books from grad school in the 1970s that no current grad student wants even for free (no grad student buys books or professional journals in paper format anymore), boxes of new file folders (which no one uses in the computer age), old jeans (as sizes went up, and sometimes down, over the years), t-shirts from the 1980s, and who knows what else. I also inherited boxes of family heirlooms from my mother (mostly junk, but including my treasured Eagle Scout badge and my grad school transcript). The organization systems I have tried to implement since the early 1970s have never really worked that well.

And the reason “disorganized and cluttered” could be dealt with easily was that I had a very organized mind and my memory was like a steel trap; if I had observed or read or heard it, the information was there. And, damn it, I never learned to clean up after myself because there were always more exciting and new things to do. And if I needed something, the odds were extremely high I could find it in the random box where it had been placed.

And this was before the Internet, before the Internet with Google, before the Internet with voice-controlled Google. Information organization needs have exploded.

And this was before I had a neurodegenerative disease with memory loss, a significantly lowered ability to multi-task and make decisions, a big temper when frustrated, a lowered ability to separate perceptual figure from ground (or to spot the object I wanted in the clutter), and many other dementia symptoms.

Now clutter just destroys me. I waste much time every day trying to find things, organize things, decide what to throw out and what to keep, and put things where I can find them. I get extremely anxious and agitated in clutter, but I cannot figure out what to discard without fearing that I will then face a world-shattering event without the one paper or piece of clothing or knife or key or medical record that would have saved the world.

