TIPS Online, Volume 2, Issue 7 (July/August 1998)

Electronic Surveying: A Decision-Making Tool

When the "Connecting the Campuses” project wanted to test an electronic, decision-making model, the project team believed that a surveying software was a vital component for collecting data through email/web sites. The team wanted a software that could electronically create and send a survey, receive the responses, compute statistical calculations on the received information as it arrived, and create understandable charts for use in presentations.

"Decisive Survey" from Decisive Technology in Palo Alto was selected. The particular version that was selected permitted the surveying of 100 individuals at one time. Moreover, the survey could be sent out multiple times. The problem created by this choice was that collected data had to be merged because the "sample" used was greater than 100. The merged analysis had to be conducted in SPSS (a high-powered statistical package) or a spreadsheet/database program.

The software offered four kinds of survey items: (1) multiple-choice items that solicited a single response; (2) multiple-choice items in which respondents could choose several answers, including "all of the above"; (3) "rating" items in which respondents selected from a range of choices; and (4) "short-answer" items in which respondents provided information in their own words. Following the advice of the software's creators, who claimed that respondents lose interest after 10-15 questions, the project team initially limited the instrument to 10 questions; the count was later expanded when one question was split to improve clarity. The team developed items of all four kinds.
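The article does not describe how Decisive Survey stores its items internally. Purely as an illustration, the Python sketch below models the four item kinds with a simple data structure and checks whether a respondent's answer fits its kind; none of the names come from the actual product.

    # Illustrative sketch: a minimal model of the four survey item kinds.
    from dataclasses import dataclass, field

    @dataclass
    class Item:
        prompt: str
        kind: str                      # "single", "multiple", "rating", "short"
        choices: list = field(default_factory=list)

    def answer_is_valid(item, answer):
        if item.kind == "single":      # exactly one of the listed choices
            return answer in item.choices
        if item.kind == "multiple":    # any subset of the listed choices
            return all(a in item.choices for a in answer)
        if item.kind == "rating":      # one point on a fixed scale
            return answer in item.choices
        if item.kind == "short":       # free text in the respondent's own words
            return isinstance(answer, str)
        return False

    q1 = Item("How useful was the virtual conference?", "rating", [1, 2, 3, 4, 5])
    print(answer_is_valid(q1, 4))      # True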

Because the team wanted to test the complete turn-around cycle of the survey, a short time frame was allowed for the full cycle. The e-mail distribution list was created on June 1, 1998; it contained individuals who had registered for or attended virtual conferences conducted under the project. A day or two later, an e-mail note from the project director alerted the potential respondents to watch for the survey and explained what the project team was trying to learn by conducting it. By June 5 the survey had been transmitted electronically to the list, with instructions to complete the response by June 14. Several surveys were returned by remote e-mail systems because the addressees were unknown; a number of the addresses were corrected and the surveys resent.

The software generated a survey for each potential respondent. The survey itself arrived as text in an electronic mail message. This e-mail contained several items: (1) explanatory instructions about completing the survey; (2) the e-mail addresses of the survey creator and the project director, so that problems could be addressed quickly; (3) the survey itself; and (4) a "survey authentication marker." This software-generated marker contains letters and numbers that uniquely identify an individual survey and facilitate results collection and follow-up when individuals return their responses.
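How Decisive Survey actually constructs its marker is not described in the article. As a hypothetical illustration of the general idea, the Python sketch below generates a short alphanumeric code for each address and keeps a lookup table so that a returned response can later be matched to its recipient.

    # Illustrative sketch only: generate a unique alphanumeric "marker" per
    # recipient and keep a lookup table for matching returned responses.
    import secrets
    import string

    ALPHABET = string.ascii_uppercase + string.digits

    def make_marker(length=10):
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    def assign_markers(addresses):
        table = {}
        for address in addresses:
            marker = make_marker()
            while marker in table:          # guard against a rare collision
                marker = make_marker()
            table[marker] = address
        return table

    markers = assign_markers(["one@example.edu", "two@example.edu"])
    for marker, address in markers.items():
        print(f"Survey for {address} carries marker {marker}")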

Upon receiving the e-mail survey, respondents created a reply message that contained the survey and their responses. Respondents were told to make certain the marker appeared in the body of the reply, or the results could not be processed.

The completed responses were transmitted by e-mail to an address created to receive them. The survey software automatically collected and processed the surveys from that e-mail in-box. Some surveys were returned without the authentication marker and had to be processed manually, either by pasting an authentication marker into the survey or by entering the data by hand.
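The collection step itself was handled by the survey software, and its internals are not described in the article. As an assumed illustration of the general idea, the Python sketch below scans the text of a reply for a marker of the form generated above and routes the message either to automatic tallying or to a pile for manual review.

    # Illustrative sketch only: sort returned survey text by whether the
    # authentication marker can be found in the message body.
    import re

    MARKER_PATTERN = re.compile(r"\b[A-Z0-9]{10}\b")   # format assumed above

    def route_reply(body, known_markers, tallied, needs_manual_review):
        match = MARKER_PATTERN.search(body)
        if match and match.group() in known_markers:
            tallied.append((match.group(), body))       # processed automatically
        else:
            needs_manual_review.append(body)            # marker missing or unknown

    tallied, manual = [], []
    route_reply("Marker ABCDE12345 ... 1) b  2) a,c", {"ABCDE12345"}, tallied, manual)
    route_reply("I deleted the marker, sorry! 1) a", {"ABCDE12345"}, tallied, manual)
    print(len(tallied), "tallied,", len(manual), "for manual review")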

The software kept track of responses and allowed follow-up surveys to be sent to those who had not responded. Because the research analyst can tell how any particular individual responded, however, this approach raises questions about anonymity in some kinds of surveys.
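The follow-up feature implies the same kind of marker-to-address table: any address whose marker has not come back identifies a non-respondent, which is also why anonymity becomes a concern. As a hypothetical illustration, the Python sketch below lists the addresses that would receive a reminder.

    # Illustrative sketch only: find addresses whose markers have not been
    # returned, i.e. the people who should receive a follow-up survey.
    def pending_followups(marker_table, returned_markers):
        return [address
                for marker, address in marker_table.items()
                if marker not in returned_markers]

    marker_table = {"ABCDE12345": "one@example.edu", "FGHIJ67890": "two@example.edu"}
    returned = {"ABCDE12345"}
    print(pending_followups(marker_table, returned))   # ['two@example.edu']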


