NOAA National Environmental Satellite, Data, and Information Service National Climatic Data Center, U.S. Department of Commerce
NOAA Paleoclimatology Program, NCDC Paleoclimatology Branch  
Instructional Design Evaluation for
The Climate Time Line Information Tool
Evaluated by: Lucie Sommer in May 2002

Following is an evaluation of the existing Climate Time Line prototype by instructional design expert Lucie Sommer, who assessed the website from the perspective of how well it serves as an instructional tool and identified the areas of the site that need improvement for it to become a more effective learning tool.

Overview
Teaching/Learning Purpose
Teaching/Learning Methods and Activities
Graphic Design Issues
References


Expanded explanation of evaluation ratings and preliminary recommendations
Teaching/Learning Purpose
Teaching/Learning Methods and Activities
Graphic Design Issues

Potential Design/Development Tasks and Roles for Future CTL Project

Overview

Instructional Design Questions

Ratings: Y = yes, N = no, Y- = qualified yes, Y&N = mixed, ? = undetermined, NA = not applicable

TEACHING/LEARNING PURPOSE

Does the tool provide clear information about its purpose? [Y-]
Is the tool aimed at a specific instructional context? [N]
Does the tool target a specific user population? [N]
Are the overall teaching/learning (T/L) objectives for the tool clear? [N]
Are the specific T/L objectives for individual content units clear? [N]
Do the T/L objectives meet the needs of the targeted users? [?]

CONTENT ORGANIZATION

If there is a content introduction or overview, is it clear and useful? [Y&N]
If there are content summaries or reviews, are they clear and useful? [N]
Does the tool's organization/navigation system enhance the T/L experience? [Y&N]
Does the organization/navigation system employ logical hierarchies? [N]
Is the organization/navigation system consistent throughout the materials? [Y&N]
Is the organization/navigation system easy to learn? [N]
Can the learner easily move forward or backward through the materials? [Y]
Are the individual units of content sequenced logically? [Y&N]
Are the individual units of content organized and sized effectively? [N]

CONTENT QUALITY

Are the individual content units complete, accurate, and up-to-date? [?]
Does the tool integrate recognized content standards for the subject matter? [?]
If there are tests, do they match original learning objectives? [NA]

TEACHING-LEARNING METHODS and ACTIVITIES

Are there outside resources that target the same T/L objectives as this tool? [?]
Does this tool promote these T/L objectives in unique/innovative ways? [?]
Is there a dominant T/L method that shapes the activities within the tool? [Y]
Do the T/L activities within the tool employ mixed methods? [Y-]
Do the T/L activities reflect current research about effective T/L practice?
   IE: Do they stimulate interest and curiosity? [Y]
   Do they provide opportunities for active engagement? [N]
   Do they support self-directed learning? [Y]
   Do they exercise different levels of knowledge? [N]
   Are there avenues for regular and useful feedback to students? [N]
Do the T/L strategies and activities capitalize on the advantages of the selected media?
   IE: Does the tool make effective use of its inherent opportunities for:
   access to pertinent, internet-based resources [Y&N]
   boundary-less (in terms of space and time) communication and collaboration [N]
   autonomous control of learning process [Y]
   visualization/simulation of difficult concepts [Y]

GRAPHIC DESIGN ISSUES

Does the design follow basic graphic design principles? [N]
Is the page layout helpful, consistent, and free of graphic clutter? [N]
Is the graphic placement of the navigation system(s) useful/helpful? [Y&N]
Is there one clear visual focal point on each page/screen? [N]
Is color used intentionally and effectively (for title text, body text, background, etc.)? [N]
Does each graphic element support the learning objectives for the screen? [N]
Do the graphics enhance the overall T/L experience? [N]

TEXT ISSUES

Is the text clearly readable? [Y]
Is all borrowed content properly referenced? [?]
Are the materials free of grammar and spelling errors? [Y]
Are the materials free of discriminatory examples and terminology? [Y]

TECHNICAL ISSUES

Is the tool easy to load and operate? [Y-]
Does the tool perform well across platforms, operating systems, etc.? [?]
If there are operating instructions, are they simple and clear? [NA]

References:

Barker, L., & Weston, T. (2000). "Educational Web Site Evaluation Matrix." ATLAS Evaluation and Research Group, University of Colorado.

Bass, R. (2000). "Technology, Evaluation, and the Visibility of Teaching and Learning." New Directions for Teaching and Learning, 83: 35-51.

Chickering, A.W., & Gamson, Z.F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39: 3-7.

Dick, W., & Carey, L. (1996). The Systematic Design of Instruction (4th ed.). New York: Harper Collins.

Ehrmann, S. (1995). Asking the right questions: What does research tell us about technology and higher learning? Change, March/April.

Kemp, J.E., Morrison, G.R., & Ross, S.M. (1998). Designing Effective Instruction. Upper Saddle River, NJ: Prentice Hall.

Lieb, S. Principles of Adult Learning. Publication source unknown; posted online at: <http://www.hcc.hawaii.edu/intranet/committees/FacDevCom/guidebk/teachtip/adults-2.htm>

Madhumita, & Kumar, K.L. (1995). Twenty-one guidelines for effective instructional design. Educational Technology, 35(3): 58-61.

Murk, P.J. (1993). Tested techniques for teaching adults. Adult Learning, 4(4): 26.

Pisik, G.B. (1997). Is this course instructionally sound? A guide to evaluating online training courses. Educational Technology, 37(4): 50-58.

Smith, P.L., & Ragan, T.J. (1999). Instructional Design. New York: John Wiley and Sons, Inc.


Expanded explanation of evaluation ratings and preliminary recommendations

Evaluation Questions

Y/N

TEACHING/LEARNING PURPOSE

As we talked about in our first meeting, this is one of the areas of design that needs significant attention. Exploring the tool with an eye towards understanding its current purpose, I find a "database" of climate related resources, organized by a power of ten time scale. The learning context in which this database might be employed has not been defined nor has the intended audience.

Most of the contents in the climate information database seem to be didactic presentations of specific climate topics, supported by corresponding graphics and linking to other web-based, didactic presentations. The tool provides a rich array of climate data for potential teaching/learning activities, but does not really suggest what these activities might look like. The one exception is the set of questions included in each resource section (titled "Inquiry ???"). These areas of the tool begin to define simple learning activities employing the tool's climate data. They will become more effective as you better define your teaching/learning audience, contexts, and objectives. Depending on how you all decide to go with this in the future, you may want to create two distinct, yet related tools (or two distinct sections of one larger tool): one that provides the information database, and another that supports effective instructional integration of this data.

 

Does the tool provide clear information about its purpose?

I found a couple of statements aimed at defining the purpose of the tool:

"The Climate Time Line Information Tool (CTL) ... a tool for exploring the complex world of climate science and history."

"The Climate Time Line Information Tool is designed as an interactive matrix to allow users to examine climate information at varying scales through time."

As you home in on a particular user group and tighten your learning objectives for the tool, your statement of purpose should also be revised to be more specific.

Y-

Is the tool aimed at a specific instructional context?

See above.

N

Does the tool target a specific user population?

See above. When you analyze your population it will be important to consider not only their educational profiles (including both student and faculty member needs and assets) but also their systems profiles (as in what type of technology will they be using and what type of human support is in place to support their use of the tool).

N

Are the overall teaching/learning (T/L) objectives for the tool clear?

See above.

N

Are the specific T/L objectives for individual content units clear?

I thought it might be helpful to apply the general comments above to a specific example. I chose the drought "tutorial" as a place to explore. As with many of the site's pages, there was a rich array of content presented (both on your site's pages and on the linked sites), but I was very unclear exactly where to focus my attention or what I was supposed to be learning. I looked for these expectations to be spelled out in the "Introduction" section, but did not find anything like that there. The lack of clear learning objectives made it tough for me to effectively access and benefit from the information presented. For instance, in the "Drought history" section, there are links to all the CTL time scale pages along with several others. I found myself wondering how many of the linked pages I ought to explore and what exactly I was supposed to be learning by visiting them. Without clear expectations, I went looking for anything I could find about drought. In order to do this, I had to sort through a tremendous amount of extraneous information. I was quickly frustrated. After "wandering" through the myriad drought-related information housed within the various sections of the tutorial, I was then presented with the task of applying my learning in the "Drought data inquiry" section. Not having been clear on exactly what I was to be studying in previous sections, I performed poorly on the self-test. I had put energy into my exploration of the materials, but apparently not in the correct areas. This was also frustrating. Generally, I wanted clearer information about the mileposts on my learning journey and about my ultimate destination.

I do like the idea of theme-based learning, however, and think that the site has great potential to support that kind of approach. Revising the site's content organization system and clarifying LO's will be important first steps in being able to effectively support theme-based learning.

N

Do the T/L objectives meet the needs of the targeted users?

See above.

?

CONTENT ORGANIZATION

 

If there is a content introduction or overview, is it clear and useful?

I located content intros/overviews for the CTL tool itself in the first two paragraphs of text on the home page and on the CTL Overview page. In the first paragraph on the home page, there is text in parentheses as well as text that links to other web pages; both greatly distract from the introductory purpose at hand. The text in the second paragraph is generally clear and useful, though as I mentioned above, it may need to be more focused as the tool's purpose is clarified.

The first paragraph of text on the CTL Overview page is also generally clear and useful. The table below this text, however, seems to duplicate the navigation bar on the left of all screens as well as the summary information that can be found in the "Climate Summary Timeline" on the first page (please see comments below about this summary information). Generally, I felt that this table includes too much information to be useful as an overview tool. What users are looking for in an overview section are concise/at-a-glance aids to help them quickly understand what is coming or what is available for more in-depth exploration.

Y&N

I also looked closely at the intros/overviews for the two organizing topics of the tool "Climate Science" and "Climate History". I located these in the 2nd paragraph on the homepage, in the Overview pages for each of these topics and in the titles that introduce the pages related to these content areas. The definitions for Climate Science and Climate History were highly variable throughout the tool. I found this confusing.

For Climate Science, definitions included: "climatic processes", "investigating cycles and systems", "investigating climatic and environmental processes", "Climate Science provides insight into how scientists use data to study climate variability."

For Climate History, definitions included: "specific climate events of the past", "exploring climate events and human development", " Climate History looks at the past 100,000 years of human existence and explores specific climate events which may have challenged or changed human activities."

It will be important to tighten these concepts as you expand the tool in the future.

 

If there are content summaries or reviews, are they clear and useful?

The summaries that I examined most closely were those that summarized the concepts of Climate Science and Climate History. I located these on the home page as rollover text associated with the Climate Summary Timeline, in the table on the CTL Overview page, and on each of the Overview pages for these two concepts.

When I looked at the summary text for the Climate Summary Timeline, what I noticed is that the summaries did not contain parallel information. Moving from left to right on the timeline and reading the summaries for Climate Science, I found information about how weather variability is scaled, what plays a role in variability, what gives rise to climate cycles, how variability can be tracked and analyzed, what influences variability, how data can be used, and temporal term definitions.

For Climate History, the summaries (again, from left to right) focused on what provides info about variability, the impacts of climate variability on human activity, the impacts on socio-ecological conditions, the relationship between climate and people, the description of climate events of the period (for two periods), and what data informs variability. I found the lack of consistency in these summaries confusing.

The same was generally true for the other areas in the tool that summarized these concepts. Tightening learning objectives in future efforts will help to better focus these summaries and to make them more useful. Taking a careful look at the diverse learning foci that are outlined above ought to be helpful in that process.

N

Does the tool's organization/navigation system enhance the T/L experience?

I found the navigation bar on the left of the screen to be quite useful and worthy of the important space it occupies. On the other hand, the navigation options at the top of the screen occupy some prime real estate on each page, but do not seem to merit it. There is probably a less prominent place where they would more effectively reside. Also, they don't seem to be parallel, in terms of logical hierarchy. This is also true for the links at the bottom of the page. These top and bottom areas of the page should be reserved for links that are used again and again.

Also, a word about the "tutorial" title used in the top-of-screen navigation bar. This is commonly the title (and often the location) used for an overview screen about how to use the site. I went there looking for this kind of information and was very confused by what I found.

When you have a better idea of your user population, you might want to think about the various ways that they might want to access the site's information. Some kind of index page that organizes by content topic is useful for a site this large and often pairs well with academic user needs. A search engine is another thing to consider.

Y&N

 

Does the overall organization/navigation system employ logical hierarchies?

The time scale framework itself is a useful organizing framework. The lack of parallel structure within each time scale, however (between summary, climate science, climate history, and resources), is a bit confusing. When I examine the content in each of these sections, there often seems to be information on the summary page that is really about climate science. Also, the links and inquiry questions in the resources section are generally quite diverse and seem to apply to/expand upon information from both the time scale's climate science and climate history sections. Almost all the information seems valuable, but it might be better organized. Again, learning objectives will help to guide a revised organization system.

Y&N

Is the organization/navigation system consistent throughout the materials?

With the exception of the table on the CTL Overview page (which I recommend that you take out), the system is quite consistent.

Y&N

Is the organization/navigation system easy to learn?

As it is now, there are quite a few distractions, in terms of navigation options. If you stick to the navigation bar on the left and redo the other systems so that they support that, I think it will be much easier to learn.

N

Can the learner easily move forward or backward through the materials?

Y

Are the individual units of content sequenced logically?

The overall organization of content is largely assisted by the time scale framework. Again, though, the organization within these units needs attention.

Y&N

Are the individual units of content organized and sized effectively?

In order to answer this question, I perused several sets of the time scale pages (1 year, 10 years, and 100 years). My general comments about the organization of these pages can be found in the previous evaluation summary on logical hierarchies (see above). In addition to these observations, I found there was a lot to look at on each climate science and climate history page. If one actually followed all the links on the pages, the amount of information quickly became overwhelming to take in, particularly without clear learning objectives for each page.

It may be that each individual concept would be more effectively presented if it were on a page of its own. Given that many of the concepts within the time scale pages overlap/interact, some kind of "concept relationship summary" might also be useful. A couple of the summary pages did begin to provide this kind of information, though a more thorough and consistent explanation of concept relationships would be useful for all the different time scales pages.

N

I also found myself wishing for some way to understand the time scales in relationship to one another and to quickly compare/contrast them as a learner. Currently, they all begin quite differently and contain highly variable content. While I'm sure that creating a parallel content structure might be challenging, given the different issues in each scale, is there a way to at least begin the units from a common ground? It would greatly help integrate the whole time scale framework.

 

CONTENT QUALITY

 

Are the individual content units complete, accurate, and up-to-date?

I found myself wondering mostly about the completeness of the content. Which climate science/history topics were represented and why? Are there others that ought to be? Who should have input into these content selection decisions? Would they be different if faculty from other disciplines were defining them?

?

Does the tool integrate recognized content standards for the subject matter?

Again, this will depend on how you define learning objectives and target audience. This may be somewhat complicated if you decide to try and make this an interdisciplinary tool.

?

If there are tests, do they match original learning objectives?

Currently, there are no tests.

NA

TEACHING-LEARNING METHODS and ACTIVITIES

 

Are there outside resources that target the same T/L objectives as this tool?

Once you've solidified T/L objectives, this will be important to research.

?

Does this tool promote these T/L objectives in unique/innovative ways?

Again, this will be a critical consideration once your T/L objectives are formulated.

?

Is there a dominant T/L method that shapes the activities within the tool?

As was mentioned earlier, most of the content is presented in an information "database" format rather than as learning activities. The most common method for presentation is didactic presentation.

Y

Do the T/L activities within the tool employ mixed methods?

Most information is presented in a didactic manner. There are questions related to each time scale section in the Inquiry ???'s section. (Though this section is titled "Inquiry ???'s", I found the questions to be more reflective summary and integrative questions than true inquiry questions.) A few of the website links employ other teaching/learning strategies, but again, by and large, a didactic presentation mode is the primary method employed.

Y-

Do the T/L activities reflect current research about effective T/L practice?

Again, the tool seems to be more of an information database than a curriculum or a collection of targeted T/L activities. To answer the questions below, I have treated my overall experience reviewing website content as a learning activity.

 

IE: Do they stimulate interest and curiosity?

Yes, though they could do so more profoundly. This content seems especially well suited to a problem-based learning method (or case-study-based learning).

Y

Do they provide opportunities for active engagement?

N

Do they support self-directed learning?

Y

Do they exercise different levels of knowledge?

Because the preferred method for presenting content is didactic, there are few places for students to move beyond basic levels of knowledge. The higher levels of knowledge (application, analysis, synthesis, etc.) are not frequently exercised in the current design.

N

Are there avenues for regular and useful feedback to students?

N

Do the T/L strategies and activities capitalize on the advantages of the selected media?

 

IE: Does the tool make effective use of the inherent opportunities for:

 

access to pertinent, internet-based resources

Obviously, a ton of time has gone into researching relevant links. Similar energy will be required in order to make purposeful teaching/learning use of these outside resources. As a general comment, I found the huge number of links on each page to be distracting. In the site's revision, all links ought to clearly support the learning objectives for each page. Specific instructions/questions guiding students' use of these outside resources will also improve the tool. Links that do not directly support the specific learning objectives for a page ought to be moved to your related resources section. These lists would be greatly improved by site summaries (one to two sentences max). One thing to consider in future design would be an area of the page layout containing questions/related resources to extend learning. This approach would remove the distracting links from the primary text, but keep them close enough (in terms of visual proximity) that students could easily launch into related areas of study.

Y&N

boundary-less (in terms of space and time) communication and collaboration

The current design does not make use of these opportunities. They might be something to consider for future revisions.

N

autonomous control of learning process

Y

visualization/simulation of difficult concepts

The current design has a rich collection of still graphics to illustrate concepts. Creating separate pages for individual concepts may help to reduce the visual overload problem that plagues some of the pages.

Y-

GRAPHIC DESIGN ISSUES

This is another area of the site that needs significant attention. In the current design, I found graphic design issues interfering with my "learning" on a fairly regular basis (see below for specifics). If it makes you feel any better, in 99% of the projects I review (that are created by faculty teams), this is a major area of concern. We were never trained in this area of expertise and most of us have no desire to pursue this kind of training. Since these issues do greatly impact the accessibility of your content, I highly recommend hiring a graphic design consultant to make design suggestions for your future project.

 

Does the design follow basic graphic design principles?

N

Is the page layout helpful, consistent and free of graphic clutter?

N

Is the graphic placement of the navigation system(s) useful/helpful?

Again, I think the one strong visual element in the current tool is the navigation bar on the left of the screens. The others need revisiting, as I mentioned earlier.

Y&N

Is there one clear visual focal point on each page/screen?

N

Is color used intentionally and effectively? (for title text, body text, background, etc.)

N

Does each graphic element support the learning objectives for the screen?

N

Do the graphics enhance the overall T/L experience?

N

TEXT ISSUES

 

Is the text clearly readable?

Although I occasionally read a section that could use some editing, most of the text reads well.

Y

Is all borrowed content properly referenced?

I'm assuming that content that is not referenced is original content. Is that true?

?

Are the materials free of grammar and spelling errors?

You might want to get a few outside people to do a careful read, looking for these kinds of things in your final project.

?

Are the materials free of discriminatory examples and terminology?

Y

TECHNICAL ISSUES

 

Is the tool easy to load and operate?

Y

Does the tool perform well across platforms, operating systems, etc.?

I used the tool on a G-4 with Netscape. It crashed on a pretty regular basis. Some debugging seems to be warranted. I'm not sure how it performs on other systems but that will be important to explore.

?

If there are operating instructions, are they simple and clear?

NA

Potential Design/Development Tasks and Roles for Future CTL Project

DESIGN/DEVELOPMENT TASKS

PERSON(S) RESPONSIBLE

Surveying and analyzing teaching/learning needs and assets

 

Surveying and analyzing technology systems and support

 

Discussing and writing overall goals for project

 

Drafting a revised content outline for "database"

 

Collecting diverse feedback on draft outline

 

Researching and compiling shared learning objectives for individual topics within content database

 

Creating content outlines for individual topics, defining subtopics

 

Identifying shared teaching/learning needs (specific to content topics)

 

Writing a list of desired teaching/learning outcomes for select, demonstration topics

 

Identifying individuals to develop demonstration teaching/learning activities (one to develop a theme-based activity? one for an activity focused on a specific subtopic?)

 

Selecting appropriate teaching/learning methods & strategies for demonstration activities

 

Drafting design plans for demonstration activities

 

Collecting feedback on draft designs

 

Adding additional content topics to original database content outline, as needed

 

Finalizing content outline for database content

 

Redesigning navigation/organization system for project

 

Collecting feedback on nav/org system revisions

 

Programming navigation/organization system for project

 

Programming demonstration templates

 

Writing discipline related content for "database" and for demonstration templates

 

Obtaining relevant written content from outside sources

 

Entering original and borrowed content into project

 

Designing page layout/color scheme for screens

 

Programming page layout/color scheme for screens

 

Creating/editing original still graphics

 

Researching/obtaining relevant still graphics from outside sources

 

Creating/editing other original graphics (ie: Flash animations, streaming video/audio, etc.)

 

Researching/obtaining other graphics from outside sources (ie: Flash animations, streaming video/audio, etc.)

 

Updating group members on project needs/status, coordinating design/development efforts

 

Testing for technical flaws, troubleshooting technical difficulties

 

Identifying programming needs to create future activities "templates", based on demonstration activities

 

Revising/maintaining project in future

 

What is the estimated timeframe for development?

How many hours a week does each person have to work on their part of the project?

http://www.ncdc.noaa.gov/paleo/ctl/evaluation2.html
Last Updated Wednesday, 20-Aug-2008 11:22:39 EDT by paleo@noaa.gov
Please see the Paleoclimatology Contact Page or the NCDC Contact Page if you have questions or comments.