Design Evaluation for
The Climate Time Line Information Tool
Evaluated by: Lucie Sommer in May 2002
An evaluation of the existing Climate Time Line prototype by Instructional
Design expert Lucie Sommer, who assessed the website from the perspective
of how well it serves as an instructional tool and what areas of the
site need to be improved in order for it to become a more effective
instructional resource.
Explanation of Evaluation Ratings and Preliminary Recommendations
As we talked about in our first meeting, this is one of the
areas of design that needs significant attention. Exploring
the tool with an eye towards understanding its current purpose,
I find a "database" of climate related resources, organized
by a power of ten time scale. The learning context in which
this database might be employed has not been defined nor has
the intended audience.
Most of the contents in the climate information database seem to
be didactic presentations of specific climate topics, supported
by corresponding graphics and linking to other web-based, didactic
presentations. The tool provides a rich array of climate data
for potential teaching/learning activities, but does not really
suggest what these activities might look like. The one exception
to this is the questions included in each resource section
(titled "Inquiry ???"). These areas of the tool begin to define
simple learning activities, employing the tool's climate data.
These will be made more effective as you better define your
teaching/learning audience, contexts and objectives. Depending
on how you decide to proceed with this in the future, you may
want to create two distinct, yet related tools (or two distinct
sections of one larger tool)--one tool that provides the information
database, and another that supports effective instructional
integration of this data.
Does the tool provide clear information about its purpose?
I found a couple of
statements aimed at defining the purpose of the tool:
"The Climate Time Line
Information Tool (CTL) ... a tool for exploring the complex world
of climate science and history."
"The Climate Time Line
Information Tool is designed as an interactive matrix to allow
users to examine climate information at varying scales through
..." As you home in on a particular user group and tighten your learning
objectives for the tool, your statement of purpose should also
become more specific.
Is the tool aimed at a specific instructional context?
Does the tool target a specific user population?
See above. When you
analyze your population it will be important to consider not only
their educational profiles (including both student and faculty
member needs and assets) but also their systems profiles (as in
what type of technology they will be using and what kind of human
support is available for their use of the tool).
Are the overall teaching/learning (T/L) objectives for the tool clearly defined?
Are the specific T/L objectives for individual content units clearly defined?
I thought it might
be helpful to apply the general comments above to a specific example.
I chose the drought "tutorial" as a place to explore. As with
many of the site's pages, there was a rich array of content presented
(both on your site's pages and on the linked sites), but it was
very unclear where I should focus my attention or what I was
supposed to be learning. I looked for these expectations to be
spelled out in the "Introduction" section, but did not find anything
like that there. The lack of clear learning objectives made it
tough for me to effectively access/benefit from the information
presented. For instance, in the "Drought history" section, there
are links to all the CTL time scale pages along with several others.
I found myself wondering how many of the linked pages I ought
to explore and what exactly I was supposed to be learning by visiting
these pages. Without clear expectations I went looking for anything
I could find about drought. In order to do this, I had to sort
through a tremendous amount of extraneous information. I was quickly
frustrated. After "wandering" through a myriad of drought related
information housed within the various sections of the tutorial,
I was then presented with the task of applying my learning in
the "Drought data inquiry" section. Not having been clear on exactly
what I was to be studying in previous sections, I performed poorly
on the self-test. I had put energy into my exploration of materials,
but apparently not in the correct areas. This was also frustrating.
Generally, I wanted clearer information about the mileposts
on my learning journey and also about my ultimate destination.
I do like the idea
of theme-based learning, however, and think that the site has
great potential to support that kind of approach. Revising the
site's content organization system and clarifying learning objectives will be
important first steps toward effectively supporting theme-based
learning.
Do the T/L objectives meet the needs of the targeted users?
If there is a content introduction or overview, is it clear and useful?
I located content intros/overviews
for the CTL tool itself in the first two paragraphs of text on
the home page and on the CTL Overview page. In the first paragraph
on the home page, there is text in parentheses as well as text
that links to other web pages--both of which greatly distract from
the introductory purpose at hand. The text in the second paragraph
is generally clear and useful, though as I mentioned above, it
may need to be more focused as the tool's purpose is clarified.
The first paragraph
of text on the CTL Overview page is also generally clear and useful.
The table below this text, however, seems to duplicate the navigation
bar on the left of all screens as well as the summary information
that can be found in the "Climate Summary Timeline" on the first
page (please see comments below about this summary information).
Generally, I felt that this table includes too much information
to be useful as an overview tool. What users are looking for in
an overview section are concise/at-a-glance aids to help them
quickly understand what is coming or what is available for more
in-depth exploration.
I also looked closely
at the intros/overviews for the two organizing topics of the tool
"Climate Science" and "Climate History". I located these in the
2nd paragraph on the homepage, in the Overview pages for each
of these topics and in the titles that introduce the pages related
to these content areas. The definitions for Climate Science and
Climate History were highly variable throughout the tool.
For Climate Science,
definitions included: "climatic processes", "investigating cycles
and systems", "investigating climatic and environmental processes",
"Climate Science provides insight into how scientists use data
to study climate variability."
For Climate History,
definitions included: "specific climate events of the past", "exploring
climate events and human development", "Climate History looks
at the past 100,000 years of human existence and explores specific
climate events which may have challenged or changed human activities."
It will be important
to tighten these concepts as you expand the tool in the future.
If there are content summaries or reviews, are they clear and useful?
The summaries that
I examined most closely were those that summarized the concepts
of Climate Science and Climate History. I located these on the
home page as rollover text associated with the Climate Summary
Timeline, in the table on the CTL Overview page, and on each of
the Overview pages for these two concepts.
When I looked at the
summary text for the Climate Summary Timeline, what I noticed
is that the summaries did not contain parallel information. Moving
from left to right on the timeline and reading the summaries for
Climate Science, I found information about how weather variability
is scaled, what plays a role in variability, what gives rise to
climate cycles, how variability can be tracked and analyzed, what
influences variability, how data can be used, and temporal terminology.
For Climate History,
the summaries (again, from left to right) focused on what provides
info about variability, the impacts of climate variability on
human activity, the impacts on socio-ecological conditions, the
relationship between climate and people, the description of climate
events of the period (for two periods), and what data informs
variability. I found the lack of consistency in these summaries
confusing.
The same was generally
true for the other areas in the tool that summarized these concepts.
Tightening learning objectives in future efforts will help to
better focus these summaries and to make them more useful. Taking
a careful look at the diverse learning foci that are outlined
above ought to be helpful in that process.
Does the tool's organization/navigation system enhance the T/L experience?
I found the navigation
bar on the left of the screen to be quite useful and worthy of
the important space it occupies. On the other hand, the navigation
options at the top of the screen occupy some prime real estate
on each page, but do not seem to merit it. There is probably a
less prominent place where they would more effectively reside.
Also, they don't seem to be parallel, in terms of logical hierarchy.
This is also true for the links at the bottom of the page. These
top and bottom areas of the page should be reserved for links
that are used again and again.
Also, a word about the "tutorial" title used in the top-of-screen
navigation bar. "Tutorial" is commonly the title (and often the
location) used for an overview screen about how to use the site.
I went there looking for this kind of information and was very
confused by what I found.
When you have a better
idea of your user population, you might want to think about the
various ways that they might want to access the site's information.
Some kind of index page that organizes by content topic is useful
for a site this large and often pairs well with academic user
needs. A search engine is another thing to consider.
Does the overall organization/navigation system employ logical hierarchies?
The time scale framework itself is a useful organizing framework.
The lack of parallel structure within each time scale, however--between
summary, climate science, climate history and resources--is
a bit confusing. When I examine the content in each of these
sections, it seems like often there is information on the summary
page that really is about climate science. Also, the links and
inquiry questions in the resources section are generally quite
diverse and seem to apply to/expand upon information from both
the time scale's climate science and climate history sections.
Most all the information seems valuable, but it might be better
organized. Again, learning objectives will help to guide a revised
organization.
Is the organization/navigation system consistent throughout the tool?
With the exception
of the table on the CTL Overview page (which I recommend that
you take out), the system is quite consistent.
Is the organization/navigation system easy to learn?
As it is now, there are quite a few distractions, in terms
of navigation options. If you stick to the navigation bar on
the left and redo the other systems so that they support that,
I think it will be much easier to learn.
Can the learner easily
move forward or backward through the materials?
Are the individual units of content sequenced logically?
The overall organization
of content is largely assisted by the time scale framework. Again,
though, the organization within these units needs attention.
Are the individual units of content organized and sized effectively?
In order to answer
this question, I perused several sets of the time scales pages
(1 year, 10 year and 100 year). My general comments about the
organization of these pages can be found in the previous evaluation
summary on logical hierarchies (see above). In addition to these
observations, I found there was a lot to look at on each climate
science and climate history page. If one actually followed all
the links on the pages, it quickly became an overwhelming amount
of information to take in, particularly without having clear learning
objectives for each page.
It may be that each
individual concept would be more effectively presented if it were
on a page of its own. Given that many of the concepts within the
time scale pages overlap/interact, some kind of "concept relationship
summary" might also be useful. A couple of the summary pages did
begin to provide this kind of information, though a more thorough
and consistent explanation of concept relationships would be useful
for all the different time scales pages.
I also found myself
wishing for some way to understand the time scales in relationship
to one another and to quickly compare/contrast them as a learner.
Currently, they all begin quite differently and contain highly
variable content. While I'm sure that creating a parallel content
structure might be challenging, given the different issues in
each scale, is there a way to at least begin the units from a
common ground? It would greatly help to integrate the whole time
scale framework.
Are the individual content units complete, accurate, and up-to-date?
I found myself wondering
mostly about the completeness of the content. Which climate science/history
topics were represented and why? Are there others that ought to
be? Who should have input into these content selection decisions?
Would they be different if faculty from other disciplines were involved?
Does the tool integrate recognized content standards for the content area?
Again, this will depend
on how you define learning objectives and target audience. This
may be somewhat complicated if you decide to try and make this
an interdisciplinary tool.
If there are tests, do they match original learning objectives?
Currently, there are none.
Are there outside resources that target the same T/L objectives
as this tool?
Once you've solidified T/L objectives, this will be important to research.
Does this tool promote these T/L objectives in unique/innovative ways?
Again, this will be
a critical consideration once your T/L objectives are formulated.
Is there a dominant T/L method that shapes the activities within the tool?
As was mentioned earlier,
most of the content is presented in an information "database"
format rather than as learning activities. The dominant mode
of presentation is didactic.
Do the T/L activities within the tool employ mixed methods?
Most information is
presented in a didactic manner. There are questions related to
each time scale section in the Inquiry ???'s section. (Though
this section is titled "Inquiry ???'s", I found the questions
to be more reflective summary questions and integrative questions
than they are true inquiry questions.) A few of the website links
employ other teaching/learning strategies, but again, by and large,
a didactic presentation mode is the primary method employed.
Do the T/L activities reflect current research about effective teaching/learning?
Again, the tool seems
to be more of an information database than a curriculum or a collection
of targeted T/L activities. To answer the questions below, I have
treated my overall experience reviewing website content as a learning
activity.
IE: Do they stimulate interest and curiosity?
Could they do so
more profoundly? Yes, again. This content seems especially
well suited to a problem-based learning method (or case study approach).
Do they provide opportunities for active engagement?
Do they support
Do they exercise different levels of knowledge?
Because the preferred
method for presenting content is didactic, there are few places
for students to move beyond basic levels of knowledge. The
higher levels of knowledge (application, analysis, synthesis,
etc.) are not frequently exercised in the current design.
Are there avenues
for regular and useful feedback to students?
Do the T/L strategies
and activities capitalize on the advantages of the tool's particular
medium?
IE: Does the tool
make effective use of the inherent opportunities for:
access to pertinent, internet-based resources
Obviously, a ton
of time has gone into researching relevant links. Similar
energy will be required in order to make purposeful teaching/learning
use of these outside resources. As a general comment, I found
the huge number of links on each page to be distracting. In
the site's revision, all links ought to clearly support learning
objectives for each page. Specific instructions/questions
guiding students' use of these outside resources will also
improve the tool. The links that are not directly applicable
to supporting specific learning objectives for each page ought
to be moved to your related resources section. These lists
would be greatly improved by site summaries (one to two sentences
max). One thing to consider in future design would be to create
an area of the page layout that would contain questions/related
resources to extend learning. This approach would remove the
distracting links from the primary text, but keep them close
enough (in terms of visual proximity) that students could
easily launch into related areas of study.
boundless (in terms of space and time) communication
The current design
does not make use of these opportunities. They might be something
to consider for future revisions.
of learning process
visualization/simulation of difficult concepts
The current design
has a rich collection of still graphics to illustrate concepts.
Creating separate pages for individual concepts may help to
reduce the visual overload problem that plagues some of the
pages.
GRAPHIC DESIGN ISSUES
This is another area
of the site that needs significant attention. In the current design,
I found graphic design issues interfering with my "learning" on
a fairly regular basis (see below for specifics). If it makes
you feel any better, in 99% of the projects I review (that are
created by faculty teams), this is a major area of concern. We
were never trained in this area of expertise and most of us have
no desire to pursue this kind of training. Since these issues
do greatly impact the accessibility of your content, I highly
recommend hiring a graphic design consultant to make design suggestions
for your future project.
Does the design follow
basic graphic design principles?
Is the page layout
helpful, consistent and free of graphic clutter?
Is the graphic placement of the navigation system(s) useful/helpful?
Again, I think the
one strong visual element in the current tool is the navigation
bar on the left of the screens. The others need revisiting, as
I mentioned earlier.
Is there one clear
visual focal point on each page/screen?
Is color used intentionally
and effectively? (for title text, body text, background, etc.)
Does each graphic element
support the learning objectives for the screen?
Do the graphics enhance
the overall T/L experience?
Is the text clearly readable?
Although I occasionally
read a section that could use some editing, most of the text reads
well.
Is all borrowed content properly referenced?
I'm assuming that content
that is not referenced is original content. Is that true?
Are the materials free of grammar and spelling errors?
You might want to get
a few outside people to do a careful read, looking for these kinds
of things in your final project.
Are the materials free of discriminatory examples and terminology?
Is the tool easy to
load and operate?
Does the tool perform well across platforms, operating systems, and browsers?
I used the tool on
a G-4 with Netscape. It crashed on a pretty regular basis. Some
debugging seems to be warranted. I'm not sure how it performs
on other systems but that will be important to explore.
If there are operating
instructions, are they simple and clear?
Design/Development Tasks and Roles for Future CTL Project
Surveying and analyzing
teaching/learning needs and assets
Surveying and analyzing
technology systems and support
Discussing and writing
overall goals for project
Drafting a revised
content outline for "database"
Gathering feedback on draft outline
Researching and compiling
shared learning objectives for individual topics within the content
outline
Creating content outlines for individual topics, defining subtopics
Surveying and analyzing teaching/learning needs (specific to content topics)
Writing a list of desired
teaching/learning outcomes for select, demonstration topics
Forming teams to develop demonstration teaching/learning activities (one to
develop a theme-based activity? one for an activity focused on
a specific subtopic?)
Selecting teaching/learning methods & strategies for demonstration activities
Drafting design plans
for demonstration activities
Gathering feedback on draft designs
Adding additional content
topics to original database content outline, as needed
Revising outline for database content
Designing revised navigation/organization system for project
Gathering feedback on nav/org system revisions
Programming navigation/organization system for project
Researching and compiling related content for "database" and for demonstration templates
Securing permission to use written content from outside sources
Entering original and
borrowed content into project
Designing page layout/color
scheme for screens
Programming page layout/color
scheme for screens
Researching and compiling relevant still graphics from outside sources
Designing original graphics (ie: Flash animations, streaming video/audio, etc.)
Programming other original graphics (ie: Flash animations, streaming video/audio, etc.)
Updating group members
on project needs/status, coordinating design/development efforts
Testing for technical
flaws, troubleshooting technical difficulties
Assessing needs to create future activity "templates", based on the
demonstration project
What is the estimated timeframe for the project?
How many hours a week does
each person have to work on their part of the project?