
Critical Thinking in Child Welfare   

(Originally published April 2021)

Critical thinking in child welfare (as in other professions) has multiple dimensions. I have done critical thinking training with child welfare staff in Washington State for several years. At the beginning of training, I ask participants, “What ideas do you associate with critical thinking?” and “What does it mean to describe someone as a good critical thinker?” A common answer from child welfare caseworkers and supervisors is “thinking outside the box,” i.e., creative problem-solving of the sort attributed to famous fictional detectives or great scientists. Creative thinking in detective work or science might include exceptional observational skills, lightning-fast deductive reasoning, and/or inspired hypotheses that can be tested in experiments. In this view, critical thinking is creative thinking with a goal in mind.

 

Child welfare staff sometimes mention the ability to accurately interpret information gathered in assessments or investigations, with an implicit reference to analytical skills. In my experience, improving analytical skills requires careful attention to the logic of arguments and intellectual debate conducted according to strict rules, i.e., no ad hominem arguments, no insults, no deference to authority that is not supported by evidence. Brief training is not an effective approach to improving analytical skills, though trainers can offer conceptual frameworks that deepen child welfare staff’s understanding of a subject.

 

Many child welfare staff have a heartfelt desire to combat bias, i.e., fixed opinions closed to new information, through critical thinking, but have rarely heard of the heuristic biases discussed by Daniel Kahneman in his book, Thinking, Fast and Slow (2011).

 

I have yet to hear child welfare staff at any level refer to scholarly virtues such as making critical distinctions between key concepts (such as risk vs. safety, or legal permanency vs. relational permanency), or taking a hard look at the evidence for beliefs, including the application of rigorous evidentiary standards to the evaluation of child welfare practices and programs. My impression from these trainings is that evidence-based conceptual frameworks have not yet had a strong impact on the thinking of child welfare caseworkers and supervisors in this state, though this is only an impression.

 

I begin training with the acknowledgement that critical thinking is a big subject, and that any brief training is necessarily an idiosyncratic version of the elements of critical thinking a trainer believes have the greatest potential for improving child welfare practice. I also suggest that until professionals who make decisions affecting the lives of children and families have a gut-level realization that they are vulnerable to error, critical thinking is likely to remain an abstract intellectual discussion lacking personal relevance. In my experience, there is a dramatic shift in attention and interest in critical thinking once a child welfare practitioner has had the experience of being seriously mistaken regarding an assessment or decision that led to grievous harm to a child, family, or stakeholder. Until caseworkers have had this experience, they may live in a state of innocence, confident in their intuitions and judgments, with seemingly no reason for critical self-reflection. The strong motivation to improve critical thinking skills often begins with a serious error in caseworker or supervisory judgment that led to harm. Any experienced child welfare practitioner who has had this experience -- and is courageous enough to acknowledge the mistake to themselves and others -- does not want to repeat it.

 

Thinking about thinking  

 

A useful way to approach critical thinking in child welfare -- and the approach I favor in brief trainings -- is “thinking about thinking,” i.e., reflecting on the thought processes that are part of high-stakes decision making. This kind of reflection requires attention to:

 

  • How a person goes about assimilating information when a decision does not have to be made under extreme time pressure, e.g., by reviewing a large amount of case information or data; by depending on a concise summary of information; or by listening to a story about a child or family from someone who knows the family well.

  • The cognitive processes described in Thinking, Fast and Slow, i.e., the energetic, rapid, intuitive, confident judgments of Kahneman’s System 1, in which a small amount of information is quickly transformed into a story that connects the dots and fills in the blanks imaginatively (“What You See Is All There Is,” or WYSIATI); or the slow, lazy, deliberate, analytical way of System 2.

  • The emotional influences on decision making embodied in heuristic biases, i.e., shortcuts in processing information that systematically lead to error. Emotional factors are a powerful influence in heuristic biases such as confirmation bias and the halo effect, and in substantive biases as well; they also motivate common defense mechanisms such as denial and projection.

 

As a rule, it is far easier to influence decision makers in any organization to critically reflect on what they’re doing than to redirect their attention to how they’re doing it. How caseworkers conduct investigations and assessments, and how organizations are managed, are partly the result of character and culture, both of which may influence decision making without conscious reflection. For this reason, improvements in critical thinking often depend more on enhanced self-awareness than on acquiring better problem-solving or analytical skills. To paraphrase Daniel Kahneman: System 2, with its analytical approach to decision making, usually serves the needs of System 1 by rationalizing its desires, preferences, and intuitions, rather than vice versa. According to Kahneman, System 1 has unlimited energy, while System 2 has limited energy sources and is therefore lazy. In the absence of self-reflection, the tendencies of System 1 have full sway: quick, intuitive judgments based on a thin amount of information, along with confidence based on System 1’s strong gut feelings.

 

Overconfidence in its judgments is characteristic of System 1. Unreflective people may believe that the strength of their convictions is an indicator that their judgment is correct. The habit of checking System 1, e.g., by postponing judgment until there is more information, or by refraining from creating plausible stories based on little or no information, requires self-reflection, i.e., thinking about thinking, and awareness of the emotions that influence the decision-making process. Caseworkers’ positive or negative feelings regarding a parent sometimes influence case plans, just as juries in criminal trials may bring in verdicts of guilt or innocence because they have developed a dislike of a prosecutor or defense attorney, or because they don’t care for the demeanor of the defendant.

 

System 1 takes the lead when crucial decisions must be made under extreme time pressure; but even when there is time to reflect on decisions, how a person prefers to assimilate information can have a large effect on decision making. For some caseworkers and supervisors, more information regarding a child and family is always better than less, while for others too much information is as injurious to decision making as too little, leading to confusion rather than clarity.

 

Policymakers and top managers of organizations rely almost entirely on brief summaries of information, which may be well done or highly distorted and manipulative. Any decision maker who depends on summaries has turned over considerable power to the staff person (or scholar) who prepared the summary. For this reason, legislative staffers and analysts in the executive branch of government exercise enormous behind-the-scenes influence on policymakers. Scholars who want studies or reports to influence public policy quickly learn to pay close attention to their executive summary, as this is the only part of a report likely to be read by someone with political influence, usually a staffer. A summary of pertinent information and possible solutions is far more likely to influence policymakers and top managers than a fifty-page report or a lengthy research study.

 

Stories are a powerful means of persuasion, far more so than presentations of data, because stories give information emotional weight and meaning and connect the dots in familiar ways. However, stories are also potent carriers of bias (often implicit). In addition, it is difficult for many storytellers to refrain from adding fictional elements to stories to heighten audience response. Anyone who prefers stories as a way of coming to grips with case information or complex issues should learn to respond to an emotionally powerful, interesting story with the question: “This is a fascinating story, but is it true?” The answer is frequently “no,” or “true in part, with fictional elements.” Furthermore, it is often a poor idea to begin a discussion of public policy with a story if the goal is dispassionate consideration of policies that may affect thousands or millions of lives. In considering changes in law, policymakers and advocates should pay careful attention to relevant data. Stories can generate powerful motivation to act, but public policy should be developed through careful, dispassionate reflection, as well as discussion with members of affected populations.

 

How to improve decision-making skills

 

Decisive: How to Make Better Choices in Life and Work (2013) by Chip and Dan Heath is an excellent practical primer on how to combat “the four villains of decision making”: (1) narrow framing and the spotlight effect; (2) the powerful influence of confirmation bias; (3) short-term emotions and desires; and (4) overconfidence in our ability, and the ability of experts, to predict the future. The Heath brothers use the acronym WRAP to organize their strategies for improving decision-making skills. These strategies require awareness of these mental and emotional tendencies, i.e., “the four villains,” during the decision-making process.

 

The spotlight effect and narrow framing

 

According to Kahneman, the tendency to jump to conclusions (WYSIATI) based on thin information is reinforced by System 1’s “radical indifference to the quantity and quality of information.” One of the first skills child welfare caseworkers and supervisors need to develop is an understanding of the amount and type of information required to make sound decisions. There are occasions when decisions must be made quickly based on initial impressions and instinct, along with policy guidelines. Nevertheless, the strong tendency among caseworkers with excessive workloads to depend on intuition and gut feelings, i.e., “when in doubt, trust your gut,” should be firmly resisted by inexperienced staff. Intuition based on holistic pattern recognition (per Gary Klein’s book, Sources of Power: How People Make Decisions) is characteristic of experts in multiple professions, including child welfare. However, dependence on intuition is a formula for disaster when caseworkers are inexperienced. Caseworkers with fewer than 12-18 months of experience should be advised to check their intuitions (not “stuff” them) with their supervisor or with an experienced caseworker highly regarded by their peers.

 

It is important to remember that more information is not always better; for many people there is an optimum level of information on which to base specific decisions. Inexperienced caseworkers need conscientious supervision, coaching, and/or peer mentoring to develop a sense of how much information is enough for making different kinds of decisions.

"W" stands for widening options

 

“Narrow framing” is the tendency to reduce important decisions to an “either/or” format, e.g., to place a child in foster care or not. Skilled decision makers widen their options (the W in the acronym). For example, an experienced CPS caseworker should be able to develop one or more in-home safety plans as an option when safety threats are apparent. Further, they should be well informed regarding the likelihood that one or more of these options would be an effective way of protecting a child, depending on the child’s age and parental motivation to implement the plan. The tendency to reduce important decisions to “either A or B” should be strongly resisted.

 

Countering the power of confirmation bias

 

In my view, no one, regardless of IQ or degree of self-awareness, can overcome confirmation bias without the assistance of others. Confirmation bias is by far the most powerful heuristic bias in child welfare and in every other profession. Everyone, without exception, operates differently once they have made up their mind and declared their view in a social setting. At this point, decision makers are “in the tunnel” (to quote Michael Connelly’s fictional detective, Harry Bosch). Persons who have staked their reputation to even the slightest degree on an opinion or perspective primarily look for evidence that they are right and ignore or reject evidence that they are wrong. Ugly or dumb substantive biases are difficult to change because a biased person looks for, and finds, evidence that confirms their beliefs, and ignores or dismisses evidence that disconfirms their views. For this reason, truth seeking in child welfare (as in science) is a social enterprise. No one thinks clearly or well without critical feedback from knowledgeable peers, a guideline American culture has increasingly ignored in recent decades. Too much agreement ends critical thinking but may enhance implementation efforts. Learning organizations must strike a balance between tolerance for critical “yes-buts” and commitment to a definite course of action.

 

Anyone who wants to become a better decision maker must develop a tolerance for critical feedback, in part by slowing down and listening to critics, and by contributing to a social environment in which it is acceptable to ask peers, supervisors, and managers hard questions (expressed in a civil manner) and expect a rational reply. The capacity for critical thinking crucially depends on support in the social environment, as well as on personal habits. Organizations that suppress critical feedback from the bottom and middle levels of the bureaucracy to top managers invite disaster, e.g., Boeing, Wells Fargo, Volkswagen.

 

"R" stands for reality check.  

 

Resisting short-term desires and impulses

 

Anyone in child welfare who loses control of their temper in meetings with stakeholders, or with staff above or below them in the chain of command, will pay a big price in interpersonal relationships and in reputation within and outside the agency. For-profit companies may choose to maximize short-term profit at the expense of long-term viability. In recent years, public child welfare agencies have frequently been damaged by the desire of middle managers to look good on performance indicators at the expense of the workforce and at the expense of good practice. Most experienced child welfare staff understand the organizational tendency “to look good at the expense of being good” (Shay, 2002).

 

"A" refers to  “attain distance from short term desires.”

 

Prepare to be wrong

The Heath brothers fiercely attack the idea that experts have a greatly enhanced ability to predict the future. Rather, the distinguishing characteristic of expertise (according to Decisive) is knowledge of base rates, for example the base rate of screened-in CPS re-reports within 12 months for cases classified as high, moderate, or low risk at case closure. Given widespread overconfidence in prediction, skilled decision makers develop “trip wires,” i.e., indicators that a decision is not working out as planned, which allow quick action to correct mistakes.

 

"P" fills out WRAP

 

Widen your options; do a reality check; attain distance from short-term desires and impulses; and prepare to be wrong. WRAP is a self-help program for decision makers at all levels of child welfare, as well as an organizational development strategy.

 

References  

 

Heath, C. & Heath, D. Decisive: How to Make Better Choices in Life and Work (2013), Crown Business, New York City.  

 

Kahneman, D., Thinking, Fast and Slow (2011), Farrar, Straus & Giroux, New York City.

 

Klein, G., Sources of Power: How People Make Decisions (1998), The MIT Press, Cambridge, Mass.

 

Shay, J., Odysseus in America: Combat Trauma and the Trials of Homecoming (2002), Scribner, New York City.           

©Dee Wilson     

  

deewilson13@aol.com

    
