Making a Case for Online Science Labs

By Harry Keller
Editor, Science Education
10 November 2008

In my last article, I spoke of states blocking progress in online science education. California and New York proscribe the use of virtual labs for their high school diplomas. Rather than complain about this situation, the online community must find ways to work with the University of California Office of the President (UCOP) and the New York State Board of Regents (Regents) to amend their rules.

There’s much at stake here — too much to waste our efforts trying somehow to pass off simulations as labs. Realize that if these states modify their rules, we open up a great set of opportunities for online education.

Instead of beginning by opposing UCOP and Regents, begin where they are and work with them. I read in the UCOP position a statement that no virtual labs that they had seen were good enough to substitute for hands-on labs. Take that as our starting point.

First, make contact with these groups. Then, show them the possibility of using online labs as a part of the instructional process. What’s the best way to make that demonstration?

Because the UCOP and Regents have not seen any virtual labs that they feel are suitable, and they have seen plenty of simulations (data, objects, and phenomena generated by equations and algorithms), do not begin by showing them what they’ve already rejected. Instead, show them something completely different.

Remember that the decision makers are taking their guidance from scientists. I’m a scientist (chemistry) and have some ideas about how these important advisors view science lab experience. Understand that the traditional education community is very protective of hands-on labs. Any solution must include these to some extent. The exact extent should be a subject of negotiation. The College Board, for example, mandates 34 hours of hands-on time for AP Chemistry.

Use America’s Lab Report for guidance and as a possible neutral meeting ground. Showing adherence to all aspects of the report will, I believe, make the necessary demonstration.

Having established communication and demonstrated the potential for online science to succeed, engage in a dialog regarding any deficiencies perceived by the UCOP and/or Regents in the various presented alternatives. Agree that one or more, if amended, can substitute for some fraction of the total hands-on requirement. Some approach may even succeed without modification.

Overcoming any such deficiencies and presenting our case again will complete the process and open the door for online science instruction throughout the United States.

Our initial presentation should include as many innovative approaches to virtual labs as we can muster and should not include simulations as lab substitutes for the reasons stated above.

I’m aware of three possibilities for presentation. None use simulations. All use the methods of science.

1. Large online scientific database investigation. Prof. Susan Singer, the lead author for America’s Lab Report, uses this approach in her own classes.

2. Remote, real-time robotic experimentation. Prof. Kemi Jona, one of the authors of the NACOL document about online science (together with John Adsit), is working with the MIT iLab people to supply these labs to students.

3. Prerecorded real experiments embedded in highly interactive software allowing students to collect their own personal data. The Smart Science® system is the only known example of this approach. (Disclaimer: I’m a creator of this system.) Apex Learning and Johns Hopkins University’s CTY are just two organizations that use these integrated instructional lab units.

I’d be happy to hear of other approaches that are not simulations and to work with anyone who’d like to see a change in the UCOP and Regents standards for lab experience. I’d especially like to talk to anyone who has contacts with the UCOP or Regents. The sooner we start in earnest, the sooner we’ll succeed.

One Response

  1. Innovation within your classroom is fairly easy. You are in control. You may even be awarded Teacher of the Year.

    Packaging the innovation (marketing) and applying (selling) it for use in other classrooms has required more effort and time than creating the innovation.

My first approach was to publish as free software, http://www.nine-patch.com. The benefits of Knowledge and Judgment Scoring were so obvious over traditional Right Mark Scoring that I expected teachers would readily import their student answer sheets and reap the benefits of accurate, honest, and fair results.

    Standardized test makers would switch to Knowledge and Judgment Scoring so marginal students would no longer be assigned pass or fail by chance alone.

Several years later, in January 2008, I started a series of YouTube videos to answer a dozen simple questions. The videos were grouped by audience: student, teacher, and administrator. By the end of the year there were 27 videos, with another half dozen in preparation, including three remakes from the original set.

In March of 2008 I started http://www.multiplechoicescoring.org to provide an index for the videos and a non-commercial URL. Five of the student and teacher videos have received five-star ratings; none of the administrator videos have been rated at all. The ratio of viewers has been 10:7:1 for student, teacher, and administrator.

To establish an innovation, we need to approach each of these groups in a way that allows real communication to occur. By interacting we can learn to understand one another; we do not all see the same question in the same way, and often one party sees no problem at all. In education it is nearly impossible to discuss a topic without first clearly defining even commonly used terms.

A current example is “test score”. A marginal student’s score on an NCLB test is meaningless for anything other than averaging with other scores for a school ranking. When multiple-choice tests are scored by simply counting the right marks, there is no way of knowing what each student really knew, or what level of thinking was involved during training and testing.
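The contrast between the two scoring approaches can be sketched in a few lines. This is a minimal illustration only: the weights here (right = 1, wrong = 0, omitted = 0.5) are assumptions chosen for the sketch, and the actual weights and reporting in Hart’s Knowledge and Judgment Scoring system may differ.

```python
def right_mark_score(answers, key):
    """Traditional scoring: count only right marks. A lucky guess and a
    trusted, known answer are indistinguishable."""
    return sum(1 for a, k in zip(answers, key) if a == k)

def knowledge_judgment_score(answers, key, omit=None):
    """Judgment-based scoring (illustrative weights): an omitted item,
    the student's report of "I don't know," earns partial credit, so
    honestly reporting uncertainty beats guessing."""
    score = 0.0
    for a, k in zip(answers, key):
        if a == omit:        # student chose not to mark this item
            score += 0.5
        elif a == k:         # trusted answer that was right
            score += 1.0
        # a wrong trusted answer earns nothing
    return score

key = ["A", "C", "B", "D"]
guesser  = ["A", "B", "B", "A"]    # marks every item, two lucky hits
reporter = ["A", None, "B", None]  # marks only what they trust

print(right_mark_score(guesser, key))           # 2
print(right_mark_score(reporter, key))          # 2 -- indistinguishable
print(knowledge_judgment_score(guesser, key))   # 2.0
print(knowledge_judgment_score(reporter, key))  # 3.0 -- judgment rewarded
```

Under right-mark scoring both students look identical; only the judgment-based score separates the guesser from the student who knows what they can trust.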

    I read the online daily SmartBrief from the Association for Supervision and Curriculum Development (ASCD) and rarely find that “level of thinking” is considered. There may be reference to “a deeper or fuller understanding” without an indication of the operational means of achieving this goal.

The result is that doing what merely gives the appearance of learning can still take priority over “deeper or fuller understanding”. In short, teachers can take the process that all scholars, scientists, and self-taught students use (question, answer including errors, and verification: a cyclic, self-correcting process involving higher levels of thinking) and divide it into a series of lessons that takes so much time that students are left to believe that what-you-observe-is-what-is-important (a linear, non-self-correcting process involving lower levels of thinking).

A good example is the type of online science experience offered by Harry Keller at Smart Science, versus “cookbook” labs that provide the correct steps to obtain a predefined result.

Without an awareness of the levels of thinking students actually use, as opposed to the ones they are merely exposed to, identical test scores can represent a wide range of quality. An A in a failing school is not the same as an A from a successful school.

Knowledge and Judgment Scoring exposes this range of quality. It also exposes what each student can trust as the basis for further instruction and learning. The right answer for a question may turn out, after class discussion, not to be the one on the answer key.

A longtime friend on campus told me this summer, “People are not interested in having this information, as it makes them uncomfortable to feel the need to make changes.” Experience in the classroom indicates that it takes just two tests for over 90% of students to voluntarily change from guessing to reporting what they trust. They learn from experience.

    With teachers and administrators I provide free software and support to help them have the experience in their classrooms. I look forward to meeting more on this blog.

    Richard Hart, PhD
    rahart@multiplechoicescoring.org
    Professor Emeritus, NWMSU
    573-808-5491
    President, Nine-Patch Multiple-Choice, Inc
    rhart@nine-patch.com
