The Language and Process that Helps Schools Vet New Products and Interventions (Part I)
[CLICK HERE to read this Blog on the Project ACHIEVE Webpage]
Dear Colleagues,
Introduction: The Big Game
Let’s be honest.
Taylor Swift aside, it seems that Super Bowl viewers are divided between those watching the game and those watching the commercials.
And the commercials in this month’s (2024) Super Bowl seemed to bring out more “Stars” than usual: Ben Affleck, Matt Damon, Tom Brady, JLo, Jennifer Aniston, David Schwimmer, Jenna Ortega, Chris Pratt, Addison Rae, Jelly Roll, Judge Judy, Ice Spice, Lionel Messi, Kate McKinnon, Vince Vaughn, Quinta Brunson, Wayne Gretzky, Christopher Walken, and more.
Stars make commercials memorable. . . using their presence to implicitly or explicitly endorse products. And this gives these products traction, and traction sells products.
No one really cares if the Star has used or validated the quality of the product. In fact, in the final analysis, some “great” Stars endorse “lousy” products.
_ _ _ _ _
Super Bowl commercials use language strategically.
The most strategic language in a commercial is the slogan. Great slogans remain in the brain forever, and they are immediately (re)associated with their product. . . even when the slogan hasn’t been used for a decade or more.
For example, for the Baby Boomers out there. . . what products are associated with the following slogans? (Sorry, Gen Z’s, you’ll have to Google these on your own):
· “I can’t believe I ate the whole thing.”
· “I bet you can’t eat just one.”
· “Where’s the beef?”
· “You deserve a break today.”
· “Put a tiger in your tank.”
· “Cleans like a white tornado.”
· “What happens in Vegas, stays in Vegas.”
· “America runs on Dunkin’.”
_ _ _ _ _
When a slogan “catches on” in the popular vernacular, it pays unexpected (pun intended) product dividends. . . literally.
Just like the “Stars” above, a popular Slogan sells products. But the slogan truly has nothing to do with the quality of the product.
Regardless of Stars and Slogans, the quality of a product is based on whether it “does the job” effectively, efficiently, and in a cost-effective way. While the Star and Slogan may be responsible for the first purchase, the commitment to re-purchase the product and become a “lifetime user” is based on data, efficacy, and customer experience.
And this is true whether the product costs $5.00 or $50,000.
_ _ _ _ _ _ _ _ _ _
Media Psychology, Media Literacy, Product Literacy, and Buying New Educational Stuff
Media Psychology is a “newer” branch of psychology that examines the ways people are impacted by media and technology. Consumer psychology, meanwhile, studies how people’s thoughts, beliefs, emotions, and perceptions influence what they buy and use.
In today’s digital, on-line world, these two areas are virtually (sorry. . .) interdependent.
So what does all of this have to do with education?
Educators. . . on their own behalf, and on behalf of their students, schools, districts, and/or other educational settings. . . are consumers who are influenced by media.
We purchase curricula, on-line instructional or intervention products, assessment or evaluation tools, professional development programs, and direct and indirect consultation services.
We, sometimes, are influenced by famous and notable authors, speakers, researchers, and product developers (the “Stars”) and their status or testimonials. . . and sometimes by Slogans, marketing campaigns, and “research” that would never be accepted by the Editorial Board of any refereed professional publication in our field.
Just like Middle School students (see below), educators need to be trained in Media Literacy so they can be Product Literate.
_ _ _ _ _
Media Literacy From Middle School Students to “Middle School” Educators
A November 14, 2023 article in U.S. News & World Report stated:
The rise of the internet and social media makes it easier than ever to access information – and that includes information that’s false or misleading. Add to that increasing political polarization, eroding trust in mainstream media and institutions, the rise of artificial intelligence products and a tendency to dismiss any information one doesn’t agree with, and many agree that the ability to critically evaluate information is a vital subject for schools.
When you get to AI-generated content, it makes fact-checking even more necessary because AI-generated images, texts and narratives can be inaccurate, biased, plagiarized or entirely fabricated. It can even be created to intentionally spread disinformation. We’re talking about media literacy, and within media literacy is news literacy.
The need for better media literacy education in schools is nearing a breaking point, some experts say, and health professionals have made public pleas saying as much. In May 2023, U.S. Surgeon General Admiral Vivek H. Murthy called on lawmakers to support media literacy in schools, while the American Psychological Association issued a health advisory recommending teenagers be trained in social media literacy before using the platforms.
Media literacy education teaches students to think critically about media messages and to create their own media “thoughtfully and conscientiously,” according to Media Literacy Now. According to (its annual) report: Ohio, New Jersey, Delaware, and Florida require K-12 media literacy standards. New Jersey, Delaware, and Texas require K-12 media literacy instruction. Illinois, Colorado, Massachusetts, Nebraska, and Connecticut require some limited form of media literacy instruction. Nebraska and Minnesota require standards in some grades and subjects.
What Does a Good Media Literacy Program Look Like?
A good media literacy program starts by teaching students how to ask good questions and become interrogators of information, experts say. By the end of middle school, students should be able to read laterally, meaning they can use the internet to check the veracity of news they see online.
_ _ _ _ _
Common Sense Media, in a June 4, 2020 article, expanded on the goals of a sound media literacy program and the resulting questions that Middle School students should be able to answer.
Upon completion, the program should help students:
- Learn to think critically. As kids evaluate media, they decide whether the messages make sense, why certain information was included, what wasn't included, and what the key ideas are. They learn to use examples to support their opinions. Then they can make up their own minds about the information based on knowledge they already have.
- Become a smart consumer of products and information. Media literacy helps kids learn how to determine whether something is credible. It also helps them determine the "persuasive intent" of advertising and resist the techniques marketers use to sell products.
- Recognize point of view. Every creator has a perspective. Identifying an author's point of view helps kids appreciate different perspectives. It also helps put information in the context of what they already know -- or think they know.
- Identify the role of media in our culture. From celebrity gossip to magazine covers to memes, media is telling us something, shaping our understanding of the world, and even compelling us to act or think in certain ways.
- Understand the author's goal. What does the author want you to take away from a piece of media? Is it purely informative, is it trying to change your mind, or is it introducing you to new ideas you've never heard of?
_ _ _ _ _
The key questions that students should learn to ask are:
Who created this? Was it a company? Was it an individual? (If so, who?) Was it a comedian? Was it an artist? Was it an anonymous source? Why do you think that?

Why did they make it? Was it to inform you of something that happened in the world (for example, a news story)? Was it to change your mind or behavior (an opinion essay or a how-to)? Was it to make you laugh (a funny meme)? Was it to get you to buy something (an ad)? Why do you think that?

Who is the message for? Is it for kids? Grown-ups? Girls? Boys? People who share a particular interest? Why do you think that?

What techniques are being used to make this message credible or believable? Does it have statistics from a reputable source? Does it contain quotes from a subject expert? Does it have an authoritative-sounding voice-over? Is there direct evidence of the assertions it’s making? Why do you think that?

What details were left out, and why? Is the information balanced with different views -- or does it present only one side? Do you need more information to fully understand the message? Why do you think that?

How did the message make you feel? Do you think others might feel the same way? Would everyone feel the same, or would certain people disagree with you? Why do you think that?
_ _ _ _ _
Shifting this back to the need for educators to be Media and Product Literate, I would like you to re-read the above Common Sense Media quotes, substituting the word “Educators” for “students” and “kids.” As you do this, think about a time when you were reviewing and evaluating, for purchase, the marketing materials or an on-line review of a curriculum, on-line instructional or intervention product, assessment or evaluation tool, professional development program, or direct and indirect consultation service.
Hopefully, enough said.
Like a media-illiterate Middle School student (with no disrespect intended), how many times do we fall prey to product hyperbole, misinformation, and barely perceptible deception. . . and how often should we ask colleagues with more sophisticated psychometric, technical, or evaluative skills for an independent product appraisal?
_ _ _ _ _ _ _ _ _ _
Analyzing Some Product Literacy Language
One of the “slogans” in educational research-to-practice involves the terms used to suggest that a curriculum, on-line instructional or intervention product, assessment or evaluation tool, professional development program, or direct and indirect consultation services has been validated.
The most common terms are: “Scientifically-based,” “Evidence-based,” and “Research-based.”
Because these terms are often thrown into the marketing descriptions of certain products, it is important for educators to understand (a) their definitions, histories, and how they differ; and (b) the questions and objective criteria needed to determine that a product validly provides the student, staff, or school outcomes that it asserts (and that one or more of the terms above can be used accurately).
Relative to history and definitions, we need to consider the Elementary and Secondary Education Act of 2001 (No Child Left Behind—ESEA/NCLB) and 2015 (Every Student Succeeds Act—ESEA/ESSA); and ESEA’s current “brother”—the Individuals with Disabilities Education Act (IDEA 2004).
Let’s go term by term.
_ _ _ _ _
Scientifically Based
This term appeared in ESEA/NCLB 2001 twenty-eight times, and it was (at that time) the “go-to” definition in federal education law when discussing how to evaluate the efficacy, for example, of research or programs that states, districts, and schools needed to implement as part of their school and schooling processes.
Significantly, this term was defined in the law. According to ESEA/NCLB:
The term scientifically based research—
(A) means research that involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge relevant to education activities and programs; and
(B) includes research that—
(i) employs systematic, empirical methods that draw on observation or experiment;
(ii) involves rigorous data analyses that are adequate to test the stated hypotheses and justify the general conclusions drawn;
(iii) relies on measurements or observational methods that provide reliable and valid data across evaluators and observers, across multiple measurements and observations, and across studies by the same or different investigators;
(iv) is evaluated using experimental or quasi-experimental designs in which individuals, entities, programs, or activities are assigned to different conditions and with appropriate controls to evaluate the effects of the condition of interest, with a preference for random-assignment experiments, or other designs to the extent that those designs contain within-condition or across-condition controls;
(v) ensures that experimental studies are presented in sufficient detail and clarity to allow for replication or, at a minimum, offer the opportunity to build systematically on their findings; and
(vi) has been accepted by a peer-reviewed journal or approved by a panel of independent experts through a comparably rigorous, objective, and scientific review.
_ _ _ _ _
The term “scientifically based” is found in IDEA 2004 twenty-five times—mostly when describing “scientifically based research, technical assistance, instruction, or intervention.”
The term “scientifically based” is found in ESEA/ESSA 2015 ONLY four times—mostly as “scientifically based research.” This term appears to have been replaced by the term “evidence-based” (see below) as the “standard” that ESEA/ESSA wants used when programs or interventions are evaluated for their effectiveness.
_ _ _ _ _
Evidence-Based
This term DID NOT APPEAR in either ESEA/NCLB 2001 or IDEA 2004.
However, it does appear in ESEA/ESSA 2015—sixty-three times (!!!). . . most often when describing “evidence-based research, technical assistance, professional development, programs, methods, instruction, or intervention.”
Moreover, as the new (and current) “go-to” standard when determining whether programs or interventions have been empirically demonstrated as effective, ESEA/ESSA 2015 defines this term.
As such, according to ESEA/ESSA 2015:
(A) IN GENERAL.—Except as provided in subparagraph (B), the term ‘evidence-based’, when used with respect to a State, local educational agency, or school activity, means an activity, strategy, or intervention that—
‘(i) demonstrates a statistically significant effect on improving student outcomes or other relevant outcomes based on—
‘(I) strong evidence from at least 1 well-designed and well-implemented experimental study;
‘(II) moderate evidence from at least 1 well-designed and well-implemented quasi-experimental study; or
‘(III) promising evidence from at least 1 well-designed and well-implemented correlational study with statistical controls for selection bias; or
‘(ii)(I) demonstrates a rationale based on high-quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and
‘(II) includes ongoing efforts to examine the effects of such activity, strategy, or intervention.”
(B) DEFINITION FOR SPECIFIC ACTIVITIES FUNDED UNDER THIS ACT.—When used with respect to interventions or improvement activities or strategies funded under Section 1003 [School Improvement], the term ‘evidence-based’ means a State, local educational agency, or school activity, strategy, or intervention that meets the requirements of subclause (I), (II), or (III) of subparagraph (A)(i).
_ _ _ _ _
Research-Based
This term appeared five times in ESEA/NCLB 2001; it appears four times in IDEA 2004; and it appears once in ESEA/ESSA 2015. When it appears, the term is largely used to describe programs that need to be implemented by schools to support student learning.
Significantly, the term “research-based” is NOT defined in either ESEA law (2001, 2015) or in IDEA 2004.
_ _ _ _ _ _ _ _ _ _
What Should You Know and Ask When Evaluating Programs Using These Terms?
Scientifically Based
At this point in 2024, if a product developer uses the term “scientifically based,” s/he probably doesn’t know that this term has functionally been eliminated as the “go-to” term in federal education law. At the same time, as an informed consumer, you can still ask the developer what s/he means by “scientifically based.”
Then. . . if the developer continues to say that his/her product is scientifically based, you should request and review the validating research studies—preferably ones published in refereed journals—with descriptions of their:
· Demographic backgrounds and other characteristics of the students participating in the studies (so you can compare and contrast these students to your students);
· Research methods used in the studies (so you can validate that the methods were sound, objective, and that they involved control or comparison groups not receiving the program or intervention);
· Outcomes measured and reported in the studies (so you can validate that the research was focused on student outcomes, and especially the student outcomes that you are most interested in for your students);
· Data collection tools, instruments, or processes used in the studies (so that you are assured that they were psychometrically reliable, valid, and objective—such that the data collected and reported are demonstrated to be accurate);
· Treatment or implementation integrity methods and data reported in the studies (so you can objectively determine that the program or intervention was implemented as it was designed, and in ways that make sense);
· Data analysis procedures used in the studies (so you can validate that the data-based outcomes reported were based on the “right” statistical and analytic approaches);
· Interpretations and conclusions reported by the studies [so you can objectively validate that these summarizations are supported by the data reported, and have not been inaccurately- or over-interpreted by the author(s)]; and the
· Limitations reported in the studies (so you understand the inherent weaknesses in the studies, and can assess whether these weaknesses affected the integrity of the conclusions—relative to the efficacy of the programs or interventions—drawn by the studies).
_ _ _ _ _
Evidence-Based
If a product developer describes a program or intervention as “evidence-based,” you need to ask them whether they are using the term as defined in ESEA/ESSA 2015 (see above) and, if so, which criteria in the law their product has met.
Critically, very few educational products or psychological interventions meet the (“Gold Standard”) experimental or quasi-experimental criteria in the Law. In fact, most will, at best, meet only the following ESEA/ESSA 2015 criteria:
‘(ii)(I) demonstrates a rationale based on high-quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and
‘(II) includes ongoing efforts to examine the effects of such activity, strategy, or intervention.”
As such, we suggest that you, as an informed consumer, ask the product developer all of the same questions outlined in the “scientifically based” section immediately above. The answers will help you determine the objective efficacy of the product, the demographics of the students it has worked with, and the resource, time, and training needs for success.
_ _ _ _ _
Research-Based
If a product developer uses the term “research-based,” they probably don’t know that the “go-to” term, definition, and standard is now “evidence-based” in federal education law.
Moreover, as an informed consumer, a developer’s use of the “research-based” term should raise some “red flags” as it might suggest that the quality of the product’s research and its efficacy may be suspect.
As such, you will need to ask the developer the same questions in the “scientifically based” section above, independently evaluating the quality of the responses.
After (a) analyzing the information from the product’s research and implementation studies, and (b) answering the evaluative questions, you can ask yourself:
· Is there enough objective information to conclude that the “recommended” product is independently responsible for the student outcomes that are purported and reported?
_ _ _ _ _
· Is there enough objective data to demonstrate that the “recommended” product is appropriate for my student population, and will potentially result in the same positive and expected outcomes (if present)?
[The point here is that the program or intervention may be effective—but only with certain students. . . and not your students.]
_ _ _ _ _
· Will the resources needed to implement the program be time- and cost-effective relative to the “Return-on-Investment”?
[These resources include, for example, the initial and long-term cost for materials, professional development time, specialized personnel, coaching and supervision, evaluation, parent and community outreach, etc.]
_ _ _ _ _
· Will the “recommended” product be acceptable to those involved (e.g., students, staff, administrators, parents) such that they are motivated to implement it with integrity and over an extended period of time?
[There is extensive research on the “acceptability” of interventions, and the characteristics or variables that make program or intervention implementation likely or not likely.]
_ _ _ _ _ _ _ _ _ _
Some Final Product Literacy Questions
Clearly, some products and interventions have sound research that validates their practices. As an inherent part of this validation, these products have been implemented and evaluated with intensity and integrity, and they have produced meaningful and measurable student, staff, and/or school outcomes.
But even here—in analyzing the responses to the evaluative questions suggested throughout this Blog—recognize that:
· Some products or interventions will not have demonstrable efficacy;
· Some will have some positive outcomes, but they may be over-generalized by the developer so that the product appears more successful than it really is;
· Some will have positive correlational results, but not the causal results that demonstrate that the product was solely responsible for the positive outcomes;
· Some will have demonstrated efficacy—but not be applicable to your students or circumstances; and
· Some product developers will still claim—even in the face of your objective analysis regarding the (suspect or poor) quality of the research and the compromised efficacy of the product—that it is effective.
In this last situation, research is typically compromised when it is conducted (a) by convenience; (b) with small, non-representative, and non-random samples; (c) without comparisons or matched control groups; and (d) using methodologically unsound scientific approaches.
Other “published” research needs additional scrutiny when it appears on a website or in a journal that does not conduct “blind” reviews by three or more members of an established Editorial Board.
PLEASE NOTE: Anyone can do their own research, pay $50.00 to set up a website, and begin to market their products. To determine if the research is sound, if the product generates the results it purports, and if the same results will meaningfully occur in your school, agency, or setting, you need to do your own investigations, analyses, and due diligence.
_ _ _ _ _ _ _ _ _ _
Summary
Metaphorically using the commercials at the Super Bowl as a guide, this Blog emphasized and outlined (applying the goals and questions within a sound middle school Media Literacy program) why educators need to be both Media and Product Literate when reviewing and evaluating the marketing materials or on-line reviews of curricula, instructional or intervention products, assessment or evaluation tools, professional development programs, or direct and indirect consultation services that may be purchased.
We then described in detail three common terms used to “validate” these products: “Scientifically-based,” “Evidence-based,” and “Research-based.”
Here, we asserted how important it is that educators understand these terms’ respective (a) definitions, histories, and differences; and (b) the questions and objective criteria needed to determine that a product can validly provide the student, staff, or school outcomes that it asserts.
We understand that Media and Product Literacy—and their accompanying reviews—take time.
But especially for purchases that will be used or implemented for five or more years (e.g., a new reading, math, or science curriculum or on-line program; a new district Student Information or Data Management System), the review time is both responsible and essential to long-term student, staff, and school success.
A January 19, 2024 Education Week article (late last month) discussed the “Five Mistakes for Educators to Avoid When Picking ‘Evidence-Based’ Programs.”
I highly recommend that you read and discuss this article in your educational setting. Indeed, if you understand the thrust and nuances in this piece, you will realize that you already have the Product Literacy foundation that you need to competently evaluate new products or interventions.
_ _ _ _ _
Answers to the “Slogan Pop Quiz”
Oh. . . by the way. . . here are the answers to our earlier “Slogan Pop Quiz”:
· “I can’t believe I ate the whole thing.” [Alka-Seltzer]
· “I bet you can’t eat just one.” [Lay’s Potato Chips]
· “Where’s the beef?” [Wendy’s]
· “You deserve a break today.” [McDonald’s]
· “Put a tiger in your tank.” [Exxon]
· “Cleans like a white tornado.” [Ajax]
· “What happens in Vegas, stays in Vegas.” [Las Vegas]
· “America runs on Dunkin’.” [Dunkin’ Donuts]
You’re welcome!
_ _ _ _ _
A Funding Opportunity
My Friends: A lot of my school and district consultation work is funded by (often, five-year) federal grants from the U.S. Department of Education that I write for and with the districts who are interested in implementing my work.
A new $4 million grant program is coming up in a few months that needs a single moderate to large school district with at least 25 elementary schools.
As we can submit multiple grants from different districts, if you are interested in discussing this grant and a partnership with me, call (813-495-3318) or drop me an e-mail as soon as possible (howieknoff1@projectachieve.info).
Another five-year $4 million grant program will likely be announced a year from now. This program will be open to districts of all sizes. If you are interested, once again, it is not too early to talk.
BOTH grant programs focus on (a) school safety, climate, and discipline; (b) classroom relationships, behavior management, and engagement; and (c) teaching students interpersonal, conflict prevention and resolution, social problem-solving, and emotional awareness, control, communication, and coping skills and interactions.
Beyond these grants, if you are interested in my work for your educational setting, I am happy to provide a free consultation for you and your team to discuss needs, current status, goals, and possible approaches.
Again, call me or drop me an e-mail, and let’s get started.
Best,
Howie