
Technical Talk Tips

 

 

Teacher Analysis of
Technical Talk Tips

 

T. R. Girill
Technical Literacy Project
trgirill@acm.org
October 2018 (ver. 2, WP)

Context

The good-description
guidelines
turn general concerns and policies about text
usability
into personal actions that students can take when they draft or revise a
technical description.
Indeed, each guideline item is a command or instruction
(e.g., “organize the part descriptions to help your reader…”)
that prompts some specific writing behavior for students to try
or practice.
The Technical Talk Tips
chart (below) generalizes this same skill-building approach to
address the special constraints
that apply to presenting technical information orally.

Like the good-description guidelines that it extends, the four-problem
talk-analysis here is intentionally very inclusive.
It applies easily to almost any classroom (or current-events)
science talk.
Students who go on to technical careers will find this framework
equally useful when planning or improving technical talks in their
work life (talk usability is a very authentic challenge).
This also means that examples for modeling or practicing the
good-talk techniques on the chart turn up readily in your ongoing
science lessons.

Extended Techniques

The Technical Talk Tips chart extends the basic writing guidelines in two ways.

Problems.
This chart explicitly reveals and names for students the four
communication problems that listeners typically have during
a technical talk.
It also connects each added communication problem with responses,
with actions that students can take directly.
As with writing, this makes responding a matter of social responsibility,
of offering usability help for one’s audience, not just flaunting one’s
personal talents.
Contrasting each talk problem (column 3) with its
counterpart in writing (column 2) shows students why some extra
effort from them is needed to cope with the spoken case.
Then, as with the description-writing guidelines, this chart itemizes
the well-known, empirically grounded (sets of) techniques that good
science speakers use to anticipate and mitigate the four talk
problems.
Students can practice these techniques both to prepare
a good talk and to revise a poor one.

Vocabulary.
In addition to posing new problems,
the Talk Tips chart introduces extra (talk-relevant)
vocabulary and distinctions to help students cope with the extra
talk constraints.
Examples of design terms absent from the good-description guidelines
but introduced here in column 3 to clarify speech-oriented
techniques include:

  • scope/depth tradeoff,
  • milestone,
  • topic transition,
  • technical depth,
  • data density.

(The commentary below on the four specific talk problems
explains and illustrates these terms in ways that you can share
with your classes.)

Commentary on Each Talk Problem

Structure (Order: Where Are You Going?)

Problem: STRUCTURE
Technical writing: the audience can read in any order.
Technical speaking: the audience must listen in the speaker’s order of presentation.
Speaker responses:
  1. Plan your scope/depth tradeoffs early
    (use either broad and shallow treatment or narrow and deep).
  2. Reveal (overtly summarize) your talk’s structure
    (the audience cannot see your mental outline or table of contents).
  3. Announce structure milestones as you pass them
    (use verbal headings [“the third problem…”] and
    proleptics [“by contrast…”]).

The advice (among the speaker responses above) to clearly
reveal each talk’s structure and milestones
(with a spoken summary or with an introductory
“table of contents” slide) appears in every book on how to give talks.
And the implicit comparison of hearing a talk to taking a trip
(oriented with overt milestones) is straightforward.

But
you will probably need to explain to students the related tip from
psychologist Donald Norman about “scope/depth tradeoffs.”
Shopping provides a familiar example:

  • When looking for wire at the hardware store, you face a “narrow and
    deep” situation–you can ignore all aisles except for the one marked “electrical”
    (narrow), but then you have to face many specific “deep” choices (copper
    or aluminum, coated or uncoated, braided or solid).
  • When shopping for ice cream, however, your situation is “broad and
    shallow”–31 flavors confront you, but then you only have to pick one scoop
    or two.

To succeed psychologically, explains Norman,
a good technical talk must follow one of these
same two comfortable scope/depth alternatives.
Broad-ranging talks cannot get too detailed,
while deeply detailed talks must stay narrowly focused.
Student speakers need to help their audiences by becoming aware of
these two alternatives, picking one (for each talk), and then
sticking with it when they arrange their talk’s features.

Review (Rereading: Where Are You Now?)

Problem: REVIEW
Technical writing: the audience can reread any passage.
Technical speaking: the audience must rely on the speaker to repeat if appropriate.
Speaker responses:
  1. Remember the slogan “tell them what you’re going to tell them,
    tell them, then tell them what you told them.”
  2. Identify and manage your topic transitions
    carefully, usually with planned repetition of structure cues.
  3. Practice to avoid pointless, accidental repetition.

In a movie, viewers can easily tell when the action shifts from scene to
scene. Likewise, the audience of a technical talk needs to easily tell when
the speaker shifts from topic to topic. Student speakers therefore must
learn to help their listeners topically by paying attention to their own
“topic transitions” and
clearly signaling the audience about them.
Using a personal
story board, perhaps constructed of one sticky note per (sub)topic
pasted on a wall or large sheet of paper for easy rearrangement,
is the standard,
Disney-pioneered way of mapping out the “scene changes” (topic transitions)
in a technical talk (so that the speaker can be sure to disclose them
to their otherwise confused listeners).

Understanding (Complexity: What Do You Mean?)

Problem: UNDERSTANDING
Technical writing: the audience can study and gradually understand.
Technical speaking: the audience must understand on the first hearing.
Speaker responses:
  1. Choose your vocabulary, examples, and comparisons to control
    your technical depth.
    Adjust to suit:

    • Your audience’s background.
    • How your talk unfolds.
  2. Manage your data density:
    • Control your amount of supporting detail.
    • Supplement your talk with detail-bearing handouts
      (references, for example).
    • Use visual aids (slides, models) to carefully increase
      data density without increasing confusion.

Technical talks are much more dense with data than is ordinary
speech. Most people can absorb dense information more easily by seeing it
than by hearing it only. So most good technical speakers supplement their
spoken words with things that their audience can also see: exhibits, handouts,
pictures, summaries, or slides.
Student speakers, therefore, need to make two separate
choices about data density:

  1. how much information to present per minute or per
    slide, and
  2. how to manage that density level with visual aids
    (especially technical slides).

For very discussable, visually clever, gender-neutral examples of data-dense
handouts or charts, visit http://www.cookingforengineers.com.
This site diagrams dozens of kitchen recipes to concisely reveal which
tasks to perform on which ingredients in which order.

If you wish to tap just one of the many published slide-advice
discussions with high relevance to science class, consult Edward R. Tufte,
Visual Explanations
(Cheshire, CT: Graphics Press, 1997, especially pp. 38-53).
Effectively managing dense science data with astute slide graphics
is one of Tufte’s major themes.

Delivery (Presentation: What Did You Say?)

Problem: DELIVERY
Technical writing: the audience can read at any pace.
Technical speaking: the audience must listen and absorb at the speaker’s pace.
Speaker responses:
  1. (Before talking) spell out your list of claims
    (to confirm just
    how many claims you have).
  2. Rehearse privately:
    • With your notes (to tune self-prompts).
    • Before a mirror (to practice eye contact).
    • With a clock (to check pace and length).
  3. Avoid saying one thing and showing another
    (plan and practice speech/slide coordination).
  4. Maintain audience interest:
    • Use short, direct sentences.
    • Show (appropriate) enthusiasm.
    • Attend to audience needs (confused? can’t hear? questions?).

No matter how thoughtfully a speaker plans the structure, topic transitions,
and data density for their technical talk, they still must actually deliver
it aloud to their audience. The tips on the chart for this delivery
problem are rather more specific than for the other three problems, and
of course students must physically try them out through personal practice
to benefit from them. You can helpfully model these tips by presenting from
a few sample slides, since slide/speech coordination often defeats beginners.
Sticky-note prompts placed on the slides themselves can help nervous speakers
manage the rough spots. “Casing the room” is another helpful behavior that
you can model here–checking the lights, projector alignment and focus,
and sight lines from the corners and rear–before launching into
the prepared talk.

All of these talk-support suggestions are very authentic. They will help
your students not only in real technical/professional venues, but also in
courtroom settings where forensic analysis is part of a case.

Poster Design for Technical Literacy

 

 

Poster Design as a Missed Opportunity for Technical Literacy

 

T. R. Girill
Technical Literacy Project
trgirill@acm.org (rev. 4)
October, 2018 (WP)


[This analysis is planned around six selected example
posters and comparison poster-like cases, all available online.
Each is cited and linked in the text, and the URL for each
appears in the reference list.]

The Problem

Posters (and their close kin, classroom bulletin boards) are
too often wasted as an opportunity to learn authentic lessons
about technical literacy.
Good students manage to do them fairly well, poor students
blunder along, but nobody gains many strategic insights for
real life from making posters.

Aside from just turning students loose, the usual default
approach refers students to collections of practical tips
and templates, such as these:

Colin Purrington,
Designing conference posters,
covers tips, examples, templates, even software
(concise yet very useful and thorough).
http://www.colinpurrington.com/tips/poster-design

Jeff Radel,
Designing effective posters,
University of Kansas Medical Center website
(29 pages, also good but more verbose).
http://www.kumc.edu/SAH/OTEd/jradel/Poster_Presentations/PstrStart.html

Jan Pechenik,
A short guide to writing about biology, 6th edition
(New York: Pearson Educational, 2007), pp. 244-250
(7 pages; excellent concise review but lacks details or examples).

These are certainly helpful, and sometimes humorously revealing,
sources. But they tend to be ad hoc.
They leave poster design isolated both from general
usability principles
and from other kinds of scientific communication.
With just a little extra work and a wider perspective, however,
poster making can reinforce (or even introduce) the importance
of effective nonfiction communication techniques for working
scientists.

An Alternative Approach

To maximize poster making as a text-design learning activity,
one needs to overtly contrast posters with virtually all other
typical science communications.
Posters need high usability, but even when (or perhaps
especially when) they have it they are not good
models for other science publications, and vice versa.
A good short technical article blown up to poster size does
not make a good poster (e.g., Dunn, 2008).
And a good poster shrunk to (two-page) article size seldom
makes a good technical article (e.g., Mason, 2006).
So more self-conscious awareness of how technical posters are
unusual is vital if students who make posters are to
learn about (or reinforce what they have already learned about)
good-writing guidelines and text usability principles from
this activity.

Three Different Constraints

Think of a science poster as a
text engineering 
project, a
design effort like any technical writing, except that most of
the design constraints on a poster are opposite from
the usual ones.
Three important constraints illustrate the consequences of this
approach.

(1) Grain Size

Science articles present rich, fine-grained analysis, which
their writers must manage in detail.
Science posters, however, present mostly sparse, coarse-grained
information.

Consider the good-description organizational
guideline that you should

Divide a described object into parts,
describing each part
sequentially in detail and
revealing its relation to the
other parts.

For articles, this is a way to manage content,
offering information piecewise so that the audience can handle
its complexity.
For posters, this is a way to manage real estate, the
huge surface that confronts the audience but is usually viewed
from a distance.
The carefully crafted paragraphs that build explanations in
a good article are not enough to yield a good poster (as the
“posterized” Dunn
article from Science shows).
Here instead the poster designer needs to choose and deploy
obvious, large-scale visual chunks, such as boxes, bubbles, or
other framed colored zones (as in Mason, 2006).

(2) Amount of Text

Science articles contain lots of words, often as many as their
authors can squeeze into the allotted pages.
Good posters, even on complex technical topics, are word sparse.
This sample-poster comparison chart shows that while even a
two-page Science article (which can be printed large
to create a fake comparison “poster”) contains 1550 words,
typical well-designed technical posters usually contain only
one-third to one-fifth as much text:

Sample Posters Compared

(Properties affecting usability: word count(*), pictures, headings, and reading path.)

(A) Science 2-page Dunn article, “posterized” for comparison:
    1550 words, 0 pictures, 0 headings; path linear, but with no visual cues.
(B) USGS topographic map (part), enlarged for comparison:
    1 picture, 0 headings; path completely flexible.
(C) Mason “Microarray” (award-winning) process poster:
    375 words, 8 pictures, 9 headings; path marked by numbered cues.
(D) Green “Metacarpal” specimen or artifact poster:
    510 words, 12 pictures, 10 headings; path marked by text cues.

(*) Excludes heads, references, captions, and acknowledgments (if any).

Consider the good-description content
guideline that you should

Omit irrelevant background and confusing detail to focus
attention on relevant technical features.

For articles, the threshold of relevance for this
guideline is fairly low.
Article readers tolerate, even demand, an elaborate characterization
of your methods, for example, so that they can study them for
flaws or try to reproduce them in other settings.
Discussions (of results) are often intricate for the same
reason.

For posters, however, the threshold of relevance is quite high.
Only the most significant aspects of a project or process can earn
space in a poster’s large-print display.
Almost all method and discussion detail must be sacrificed
here. Space constraints (amplified by the reading-time
constraints of the poster’s moving audience) require that only
the best results, the most innovative strategies, and the
boldest interpretations can appear and consume some of the very
limited supply of poster words.

(3) Use of Figures (Graphics)

In articles, text dominates graphics, even when the figures reveal
key trends or relationships.
On posters, graphics dominate text, even though text is vital
to assert key claims or explanations.

Consider the good-description signal
guideline that you should

Integrate figures and text with labels and references
for coherence.

Once again, this plays out differently in the context of articles
and posters.
In articles, readers expect careful textual analysis and
argument, and they can invest the time to follow them.
Graphics (figures) support and clarify the text. This support, however,
can be “deep”–some readers will measure pictured objects or
plot extra lines on graphs to interpolate extra values, just as
they would rely on and study the details of a topographic map
(e.g., USGS, n.d.).

On posters, the graphics replace most of the text,
out of necessity.
And they play not a map-like role, but rather one more
cartoon-like: bold and thematic rather than intricately
subtle (that does not mean inaccurate, but differently
accurate).
Heads, arrows, and large-scale diagrams or tables must do the
work that text performs on paper, because these are usually
the only poster features that viewers can access from across
a crowded room.
Nobody could read a textual alternative, nor annotate it for
study if they could read it.
As the picture, heading, and path entries in the “Sample Posters”
chart (above) reveal, the count of visual features is both
relatively and absolutely high for well-designed technical posters.
These graphic features are thoughtful but simplified substitutes
for the (appropriately) missing text.

 

Summary

The same text usability principles apply to science poster
design as to paper planning. Nevertheless, the results should look quite
different because most of the constraints on design, especially the
(a) publishing goals and
(b) reader needs,
differ greatly between the two cases.
This chart summarizes the shared framework as well as the
divergent results:

Articles and Posters Compared

Publishing goals:

  Structure
    Journal article: formal (often the IMRD standard).
    Technical poster: flexible (an easy, appealing path).

  Text (amount, style, details)
    Journal article: many words; provides careful analysis; rich enough to replicate.
    Technical poster: few words; delivers vivid highlights; strategic, focused for interest.

  Graphics (model, data density, prime role)
    Journal article: map (information in layers); high density (often tables); prime role is visual explanation.
    Technical poster: cartoon (bold key relationships); low density (often plots, pictures); prime role is visual interest.

Reader needs:

  Overview
    Journal article: an abstract, a summary meaningful apart from the article.
    Technical poster: a brief introduction, an appealing gateway to the poster.

  Reading (distance, context, speed, annotation)
    Journal article: close (inches), with attention to detail; quiet and private; slow, thorough review; frequent annotation.
    Technical poster: far (6+ feet), glanced at across the room; noisy and social; quick survey; annotation impossible.

References

Dunn, E. W., L. B. Aknin, and M. I. Norton. (2008).
Spending money on others promotes happiness. Science,
319 (21 March), 1687-1688. URL:
http://dx.doi.org/10.1126/science.1150952
Green, D. J. and A. D. Gordon. (2006).
Metacarpal proportions in Australopithecus africanus [Poster].
Center for Advanced Study of Hominid Paleobiology.
URL: http://phdposters.com/galfs/o753_p1_poster.jpg
Mason, M., X. Ding, and J. S. Dorick. (2006).
Development of high-throughput microarray for evaluating
CYP1A1 induction [Poster]. Rensselaer Polytechnic Institute.
URL: http://mms.technologynetworks.net/posters/0465.pdf
USGS. (n.d.)
Sample USGS topographic map.
URL: https://en.wikipedia.org/wiki/Topographic_map#/media/File:topographic_map_example.png

 

Designing Effective Abstracts

 

 

Effective Abstracts in Science Class

 

T. R. Girill
Technical Literacy Project
trgirill@acm.org
Rev. October, 2018 (WP)

Core Characteristics and Skills

Abstracts are one of those rare literary forms that matter in real
life, not just in school (more on that at
the end).
An abstract summarizes a paper (or, in real life, a published
article) in 250 to 500 words, but in a special way.
Abstracts circulate widely independently of the articles
that they summarize (in vast searchable online databases sold
commercially or offered by professional societies).
Hence astute abstract writers have learned the importance of
constructing a compact surrogate for their much larger paper.
A good abstract is brief and entirely self-contained, yet someone
who has never seen the paper that it summarizes should be able
to reliably judge the content and relevance of that paper
(to their own interests) by reading the abstract alone.

Abstract writing thus demands sophisticated text design skills:

  • conciseness together with
  • a well-honed sense of content priorities, and
  • disciplined verbal organization.

So teaching these skills to science students presents special
challenges as well as unusual practical opportunities.

Field-Specific Features

Most high-school students learn about abstracts (first) in
language arts or history class. So what does a good authentic
abstract look like in language arts or history?
That is an empirical question, which Helen Tibbo answered in her
revealing comparative study “Abstracting across the disciplines”
(1992).
Tibbo performed a sentence-by-sentence analysis of 30 abstracts
randomly selected from a “high-impact” (influential, widely
cited) American history journal and compared them with 30
abstracts randomly picked from a high-impact journal in chemistry
(p. 43).

The history abstracts had this content profile:

Percentage of sentences in each content category (history abstracts):

  Background       5
  Purpose/scope    9
  Hypotheses       1
  Methods          5
  Results          2
  Conclusions     14
  None of above   64

Most real-life abstracts in science and engineering, however, are
very different from this.
Technical abstracts are highly structured and divide their words
very consistently by task.
Tibbo’s analysis of chemistry abstracts, for example, found
this pattern:

Percentage of sentences in each content category (chemistry abstracts):

  Background       9
  Purpose/scope   24
  Hypotheses       0
  Methods         28
  Results         22
  Conclusions     16
  None of above    0

In other words, science abstracts strongly reflect the IMRD
(introduction [= background], methods, results,
discussion [= conclusions]) structure of most science research
articles, while (even well-written) abstracts in other fields
do not.

Few students draft science abstracts consistent with this
“chemistry” pattern without special coaching.
A likely explanation is that they learned how to craft abstracts
from history or language arts teachers mostly familiar with the
upper (unstructured) pattern rather than the more structured
lower pattern typical in science publications.
The drafting sophistication required together with the highly
structured division of the text means that your students will
probably learn how to write good science abstracts only from
their science teacher (if they learn it at all).

Structured Abstracts

Because the structure of science abstracts is so consistent and
useful to readers, a few technical journals–mostly in medicine
where confusion or ambiguity can have drastic negative clinical
consequences–now make the IMRD chunks of their abstracts
explicit with embedded headings.
Your students can see other students using this approach
(misleadingly called “structured” abstracts instead of
“labeled” abstracts) at the California State Science Fair
(CSSF) website (go to
http://www.usc.edu/CSSF/History
for a choice of recent years).

Heading details vary by publication (or year) but the usual
pattern involves overtly subdividing the abstract text with
these labels:

Objective/Goal(s)
Materials and Methods
Results
Conclusion(s)/Discussion

Even these slight prompts are seldom deployed, however.
Most professional journals still leave to the scientist (or engineer)
the task of rationing their scarce abstract space among the
IMRD topics to adequately represent the work done and
signaling the abstract’s organization without the help of
any displayed headings.
Few students do this well even if their lab or field work is
strong and interesting.

A Content Checklist for Abstracts

Giving students examples of well-crafted and of ineffective
abstracts is the usual practice to encourage good drafting.
Unfortunately, this assumes that the students can learn from
examples, always a cognitively demanding task and especially
so when the content distinctions are subtle.
Even poor abstracts are often full of “science content,”
but that content is inappropriate to thoughtfully represent
the corresponding paper or report.

Another way to scaffold abstract drafting is with an itemized
list of what science abstracts include and exclude:

When constructing your abstract…

Include: your topic or problem (purpose), but usually not your hypothesis.
Exclude: background information (except to briefly frame the problem).

Include: the scope of work.
Exclude: historical details.

Include: the treatment (experimental, theoretical, methodological, practical).
Exclude: literature review, cross references, footnotes.

Include: novel methods or algorithms (but always the age, sex, genus, and species of biological subjects).
Exclude: procedural details, apparatus diagrams.

Include: key numerical or statistical results.
Exclude: equations, formulas, data tables, graphs.

Include: significance and interesting conclusions.
Exclude: general principles or trends, common knowledge.

Include: “keywords”–searchable terms, distinctions, and comparisons that identify your work to others.
Exclude: definitions of technical terms.

This chart offers a chance not only to walk your students through
the important features, but also to explain why each item falls on
one side of the chart or the other.
For example, school science projects often stress testing some
explicit hypothesis, but (as Tibbo’s study confirms) virtually
no published abstracts include this.
Instead, they make clear the purpose of the reported work,
the specific research or design problem addressed.
Readers naturally filter abstracts looking for those whose
reported problems link to their own.

Since students spend much time mastering techniques (“methods”)
new to them but standard in a field, the focus of their work is
often strongly methodological.
Projects with a theoretical or therapeutic focus should point
this out.
Biologists always filter the work of others by organism
studied (hence the mention of subject genus, species, sex,
and age).
And every reader wants to know how a project turned out–the key
quantitative results and their significance from the author’s
perspective.
Abstracts are searched in large databases by keyword, so
smart abstracters take care that their vocabulary contains
those terms and distinctions that others are most likely to
use during such a search.

The exclusion column surprises some students because most items
listed there seem obviously relevant and important.
But an abstract needs to devote its scarce space to the
work at hand, not to general science principles or background
information readily available from reference sources.
Almost all of the other exclusions reflect the text-only
character of abstract databases: diagrams, equations, graphs,
and tables are not plain text (in journals they are often
encoded in PDF or TeX), so they will not survive sharing in
plain-text formats.
Astute writers learn ways to mention or summarize these
otherwise valuable features using text alone (in retrieval-friendly
ways).

A Science-Abstract Template

More Space Scaffolding

Explicit as the foregoing checklist is, it still leaves many students
confused about how to budget their scarce space in a science
abstract and how to implement the checklist’s content advice.
Those students need a further scaffold of a different kind when
they draft an abstract:
a top-down guide to space allocation.

I find that this template or “action matrix” meets that need well:

 

Action Matrix for Building a Good Science Abstract
(1) Allot roughly this percentage of your total WORDS…
(2) …to describe this TOPIC.
(3) Then review the draft and REVISE it to be sure that these
    subtopics are included (if relevant) and these questions are addressed.

24% Purpose: what we sought
  • Goal(s) clear and unambiguous?
  • Research/experimental design explicit?
24% Methods: what we did
  • What was actually measured, and how?
  • Any new or improved methods?
  • Steps or sequence clear?
  • Which controls used? Comparisons?
  • Risks, attrition, adverse events disclosed?
  • Subject selection, randomization, limits?
  • Blinding of subjects? Investigators?
24% Results: what we found
  • Key numerical/statistical trends revealed?
  • Outcome assessment well defined?
  • Sample size adequate?
  • Calculations or analysis appropriate? Tests stated?
  • Explains if no results (yet)?
24% Conclusions (discussion): so what
  • What is omitted that could mislead a reader?
  • Terminology inaccurate, inappropriate, needlessly hard?
  • Text too condensed or confusing (for ESL readers)?
  • Disorganized or wrongly organized claims?
4-5% Background: place first but
draft last so you don’t waste too many words here
  • Focused on problem-framing context?
  • Acronyms explained, jargon minimized?

 

The matrix assigns to each of the four key topics that a good
science abstract includes (IMRD, col. 2) roughly
one-quarter of the available total words (col. 1).
Although a little “background” comes first, I advise students
to draft it last, using a sliver of space saved from each of the
other topics.
Software makes word counts and percentage calculations easy for
anyone now, so students can give themselves simple quantitative
feedback as they iterate toward this division of space.
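
For a 400-word abstract, for example, the even split gives roughly 96
words each to purpose, methods, results, and conclusions, leaving
about 16-20 words for background.
A short script along these lines can automate that feedback; this is
only a minimal sketch, and it assumes the draft is saved as a plain-text
file whose sections are labeled "Purpose:", "Methods:", "Results:",
"Conclusions:", and "Background:" (the labels and file name are
illustrative, not something the matrix itself requires):

  # Minimal sketch: report how a draft abstract divides its words
  # among the topics in the action matrix above.
  # Assumes a plain-text draft whose sections are labeled
  # "Purpose:", "Methods:", "Results:", "Conclusions:", "Background:".

  TARGETS = {            # rough word-budget percentages from the matrix
      "Purpose": 24, "Methods": 24, "Results": 24,
      "Conclusions": 24, "Background": 5,
  }

  def section_word_counts(text):
      """Count the words under each labeled section of the draft."""
      counts = {name: 0 for name in TARGETS}
      current = None
      for line in text.splitlines():
          stripped = line.strip()
          for name in TARGETS:           # a line like "Methods: ..." switches sections
              if stripped.lower().startswith(name.lower() + ":"):
                  current = name
                  stripped = stripped[len(name) + 1:]
                  break
          if current and stripped:
              counts[current] += len(stripped.split())
      return counts

  def report(text):
      """Print each section's word count and share, next to its target."""
      counts = section_word_counts(text)
      total = sum(counts.values()) or 1
      for name, target in TARGETS.items():
          share = 100 * counts[name] / total
          print(f"{name:12s} {counts[name]:4d} words  {share:5.1f}%  (target ~{target}%)")

  if __name__ == "__main__":
      import sys
      with open(sys.argv[1]) as f:   # e.g., python abstract_budget.py draft.txt
          report(f.read())

Students can rerun the count after each revision cycle and rebalance
until the percentages approach the targets (or whatever deliberate
rebalancing, discussed next, their project warrants).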

Of course, student projects are often not well balanced among the
four IMRD topics–results (or interpretive conclusions) may be
light while (newly learned) methods are disproportionately
important.
So rebalancing the division of space from the even four-way split
suggested here can be quite appropriate: the template still
provides a disciplined framework from which to start.

More Content Scaffolding

A somewhat different problem is that some students are intimidated by
an abstract’s space constraints and include too few details about
any of their specific work.
An abstract is no place for vague generalities, however true they
are.

The template’s third (right) column addresses this problem.
Here for each topic are a few focused questions to prompt
students to include relevant details.
(Not all questions pertain to all projects, of course, but if they
pertain their answers are very interesting to technical
readers eagerly searching for work related to their own.)
The methods row contains the most questions because
(1) students often carelessly overlook their own key
methodological details, and
(2) methods refinements are often more innovative than
results or conclusions in student-level research.

Using this template along with the include/exclude itemized
chart more reliably yields good abstracts than using the
latter alone.
Template cols. 1, 2, and 3 together more thoroughly externalize
the drafting actions that students need to try.

The Authentic Payoff

 

One really satisfying aspect of teaching students how to design
effective science abstracts is that such abstracts are avidly read by
real scientists, not just by teachers grading them in science
class.
And reading such abstracts often has significant practical
consequences.

Henry C. Barry and his colleagues reported a particularly
striking example of science abstract influence in their 2001
article in the Journal of the American Board of Family
Practice.
Barry et al. asked almost 300 family-practice physicians how
they would treat corneal abrasions (e.g., patch or not) and
how they would treat fibromyalgia.
Then these physicians read only abstracts, not whole
research articles, summarizing briefly research done by others
relevant to those conditions.
After reading the abstracts alone, without consulting
the nuanced subject-control discussions and applicability
warnings in the original articles, 76% of the physicians were
willing to change their treatment of corneal abrasions and 73%
would change their use of drugs with fibromyalgia patients.
Because of time pressures in clinical practice, these doctors
routinely scanned abstract databases and adjusted their
therapeutic behavior based only on 300-word summaries of
medical research done elsewhere by complete strangers.

So unlike many things that your students learn (or should learn)
to write in school, science and engineering abstracts have
a high impact on other practitioners in real life.
Introducing students to checklists and templates that promote
abstract effectiveness may therefore increase their influence
on others during their professional lives much more than
many other superficially cogent lessons or techniques.
Abstracts are small but mighty.

Middle School Science Abstracts

Teachers sometimes ask if students can or should learn abstract-drafting
skills before they need them in high school.
Does designing effective abstracts have a place in middle school?

The Need to Learn

One positive answer comes from the Common Core State Standards
(CCSS)
for Language Arts (literacy). CCSS uses the word ‘summary’
rather than ‘abstract’ to identify a concise yet revealing nonfiction
description of a project or a larger technical text.
According to CCSS, starting in grades 6-8, active reading should
involve creating “an accurate summary of the text [read] distinct
from prior knowledge” (RST6-8.2).
Literacy standard W6.5 likewise calls out the skills that
middle-school students need to generate such summaries:

With some guidance and support from peers and adults
[such as an age-appropriate scaffold for designing abstracts,
see below],
develop and strengthen writing as needed by planning, revising,
editing, rewriting, or trying a new approach.

A second positive answer comes from regional science fairs, which
usually accept projects from students in grades 6 to 8, with an
abstract for each one.
The California State Science Fair, drawing upon local fair winners,
even publishes all of its grade 7-8 project abstracts on the
website for its junior division
(linked by topic from

http://www.usc.edu/CSSF/Current/Panels
).
One of those student-written abstracts seeds the weaker/stronger
comparison case discussed below (and these examples are available
for every topical area in science).

Content versus Grammar

When coaching middle-school students in writing abstracts, you need
to make clear that critiquing and revising an abstract (a literacy
effort) is different from critiquing or revising the underlying project
described (an effort in technical methodology).
Indeed, some students with clever projects fail to do their own work
justice with inept abstracts that inadequately reveal their
accomplishments to strangers.
Most challenging are abstracts that appear grammatically and
syntactically correct but still do not explain what the student
actually did, found, or thought. A checklist of questions and
weaker/stronger example comparisons
(see below) can help such students develop
a more sophisticated sense of what makes a science abstract
helpful to others.

A Case Study

As an extended example of revising a middle-school abstract to
make it more effective using Common Core literacy skills, consider
this (anonymized) case written to summarize a grade-7 student
project and published on the California State Science Fair site
for 2014:

[Text box: middle school sample abstract, as published on the CSSF site]

The Section Headings

CSSF abstracts always contain explicit section headings, certainly a helpful scaffold for students new to drafting abstracts. Most professional journals omit such subheads, but a few trend setters such as the Journal of the American Medical Association (JAMA) now include them to help their own physician-readers more effectively use each abstract’s content.

Objectives/Goals

Here (see text box above) is a typical instance of completely readable sentences that nevertheless reveal almost nothing about what the student tried to do. What kind of “vegetable” oil? What kind of “plant”? How would they know if the plants really “grew better”?

Consider this possible alternative goal statement:

This project explored whether giving primroses a mixture of olive oil and water instead of water alone increases leaf size.

The revised version is only half as long as the original (20 words instead of 40), yet it includes all of the helpful goal-specifying specifics about this project that the first version omits.

Methods/Materials

The original version (above) gives some details but remains vague (“watered”=?), mysterious (where did the student get the 2.5-ml amount?), and incomplete (how was growth measured?). Including sentences such as those below could address those methods gaps:

I poured a mixture of 250 ml of tap water and 2.5 ml of olive oil into each pot weekly, an amount estimated from liquid fertilizer instructions. I measured the tip-to-tip horizontal size of each plant’s leaves at their widest location with a ruler.

Results

This abstract section summarizes the student’s “average” data, but it fails to reveal the spread of that data (standard deviation, for older students) or to say what really counts as plant “growth” (height? width? weight?). Revision along these lines would help:

Water-only plants increased their maximum leaf width on average by 0.85 cm (ranged from 0.6- to 0.9-cm increase) while water-plus-oil plants increased their maximum leaf width on average by 4.57 cm (ranged from 3.5- to 5.5-cm increase).

Conclusions/Discussion

The student in this case understands that interpretation is needed but doesn’t quite know how to explain that “plants will grow better if vegetable oil is added,” nor why their project results could have practical significance. One stronger direction to try would be:

Olive oil contains hydrocarbon molecules (fatty acids) that green plants use as extra nutrients. This could also be tried on food crops.

Tools to Teach With

Different project abstracts in different science areas certainly call for different revisions. But a “structured walk-through” of a case like the one above can reveal possibilities that your young students may never notice without such writing apprenticeship.

One way to share such advice consistently, in a structured way that students can review and generalize, is with a weaker/stronger checklist organized by the parts of an abstract, such as this chart:

[Chart: abstracts.midsch2 (weaker/stronger checklist by abstract part)]

One step more general is a simplified version of the standard
good-abstract design template
adapted for younger students.
Simplified questions and an easier vocabulary bring these
authentic adult issues to bear on middle-school summary drafting
and revision:

[Chart: abstracts.midsch1 (simplified abstract-design template for younger students)]

Just teaching young students that the best abstracts are almost
always created iteratively, with several revision cycles looking
for the kind of increased focus illustrated by the case above,
will greatly enrich everyone’s summary-writing skill.

References

Barry, Henry C., et al. (2001).
Family physicians’ use of medical abstracts to guide
decision making: style or substance?
Journal of the American Board of Family Practice,
14(6), 437-442.
http://www.jabfm.org/content/14/6/437.full.pdf
Tibbo, Helen. (1992).
Abstracting across the disciplines. Library and Information
Science Research, 14(1), 31-56.

 

Good Descriptions Guidelines

 

Guidelines
for Writing Good Descriptions

Organization

  • OVERVIEW. Begin with a brief overview that reveals the object’s
    (a) overall framework, arrangement, or shape, and
    (b) purpose or function.
  • PARTS. Divide the object into parts and describe each part
    (a) in enough detail to use, make, or draw it, and
    (b) in a way that reveals its role, its relation to other parts.
  • ORDER. Organize the part descriptions to help your reader:
    (a) spatial order (top to bottom, outside to inside), or
    (b) priority order (most to least important), or
    (c) chronological order (order of [dis]assembly).

Content

  • SPECIFICS.
    • Include relevant specific features (such as size, shape, color, material, technical names).
    • Omit irrelevant background, confusing details, and needless words.
  • COMPARISON. Compare features or parts with other things already familiar.
  • CONTRAST. Contrast properties with different ones to reveal their significance.

Signals for Your Reader

  • FORMAT. Clarify your text with:
    • Heads. Identify topics with clear, nested section headings.
    • Lists. Itemize related features with indenting and marks.
    • Figures. Integrate figures and text with labels and references.
  • VERBAL CUES. Guide your reader’s expectations with:
    • Parallelism. Use parallel words and phrases for parallel ideas.
    • Proleptics. Use verbal links (also, but, however, etc.) to signal how your description fits together.

 

Good Instructions Guidelines

 

Guidelines for Writing Good Instructions

Organization

  • Are the steps presented in the order in which they are performed?
  • Is the first step really the first task that someone in the audience needs to do?
  • Should a long or complex step be broken into smaller parts?
  • Are all hidden steps made explicit?

Clarity

  • Is each step written as an overt command (beginning with an action verb)?
  • Are all steps easy to find and visually distinct?
    • List format?
    • Bullets or numbers needed?
  • Is each step precise and complete enough to be followed?
    • Includes needed details?
    • Excludes irrelevant text?
  • Are common problems covered?
    • Safety/danger warnings?
    • Troubleshooting tips?