Lesson 18: Evaluating Our Own Participatory Sensing Campaign
Objective:
Students will create statistical questions and evaluate their Participatory Sensing Campaign.
Materials:
- Campaign Creation handout (LMR_U3_L17_B) from previous lesson
- Class Campaign Information from Lesson 16
Essential Concepts:
Statistical investigative questions guide a Participatory Sensing Campaign so that we can learn about a community or ourselves. These campaigns should be evaluated before they are implemented to make sure they are reasonable and ethically sound.
Lesson:
- Review homework by giving students about five minutes to share their classifications in their teams. They will decide as a team which classification is the most fitting.
- Once the five minutes have passed, have a class discussion of the classifications and their justifications. Explain to the class that the campaign must be carried out by the whole class, so if it has been classified in the Individual category, it must be revised. Also discuss whether the campaign is feasible. (For example, is the trigger so rare that no one will collect data? Are the questions too intrusive?)
- Inform students that one of the promises of Participatory Sensing is its potential for helping people bring about social and civic change. Ask teams to consider the following questions and report back:
  - Does our campaign try to do this?
  - Could it be changed or modified to do this?
Note: Feasible campaigns fall under the Groups of People or Community categories. If a campaign is in the Individual category, it should be modified to fall under one of the other categories before moving to Round 4.
- Display the campaign information students generated (and selected as a class) the previous day or revised today: Topic, Research Question, Trigger, and Type of Data needed.
- Now they will continue the rounds using the Campaign Creation handout (LMR_U3_L17_B) from the previous lesson.
- Round 4: Now that the class has decided on a trigger and the type of data needed, they will create survey questions to ask when the trigger occurs. The questions should consider all of the possible data they might collect at this trigger event. It's OK if the list is long; the goal is to be creative and think of lots of different ideas.
Examples of survey questions for practicing cello are:
“How long did you practice?”
“What did you play?”
“How would you rate your practice session: 1 to 5?”
“Any thoughts or comments about your practice?”
- Once teams have created 4 survey questions for their group, have teams share out their survey questions. As a class, decide on no more than 10 survey questions that will be used to create the new campaign.
- Then, evaluate each survey question. For each question, students should consider:
  - What type of data will this question collect? (numerical, discrete numerical, text, categories, photos, location) See the sketch after this list for how these types might look in practice.
  - How does this question help address the research question?
  - Does the question need to be reworded? (Is it clear what is being asked for? Do students know how to answer it?)
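One way to make the data-type question concrete is a minimal sketch (in Python, purely for illustration; it is not part of the lesson materials and does not show the actual Participatory Sensing tool's data format) of how a single trigger event from the cello example might be stored as one record, with each survey answer as a typed variable. All field names and values below are hypothetical.

    # Purely illustrative: one hypothetical record for the cello campaign.
    # Field names, values, and the record format are assumptions for this sketch,
    # not the Participatory Sensing tool's actual format.
    from datetime import datetime

    practice_record = {
        "timestamp": datetime(2024, 3, 14, 16, 30),  # date/time, captured automatically
        "location": (34.0689, -118.4452),            # latitude/longitude, captured automatically
        "minutes_practiced": 45,                     # numerical
        "piece_played": "Bach Suite No. 1",          # text
        "session_rating": 4,                         # discrete numerical (1 to 5)
        "comments": "Worked on the prelude tempo.",  # open-ended text
    }

    # Each survey question should map to one variable; checking the type of each
    # answer helps confirm the question is well posed.
    for variable, value in practice_record.items():
        print(f"{variable}: {type(value).__name__} -> {value}")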
- If the survey questions need to be rewritten, assign teams to rewrite survey questions. Then, as a class, decide on the changes.
- Once finalized, write the survey question that goes along with each data variable, being cognizant of question bias.
- Round 5: In teams, students now generate two to three statistical investigative questions that they might answer with these data. Make sure the questions are interesting and relevant to the class topic of interest. Teams may keep a record in their DS journals. Remind students that they will also have data about the date, time, and place of data collection.
Examples of statistical investigative questions that can be answered for practicing cello are:
“How frequently do I practice?”
“When I practice more frequently, do I rate my sessions higher?”
“Are higher-rated sessions associated with time of day?”
- Once teams have generated their statistical investigative questions, have them share out with the class. Confirm that the questions are statistical and that they can be answered with the data the students propose to collect.
- Now that they have all the pieces of the campaign, evaluate whether it's a reasonable and ethically sound campaign. Engage the class in a whole-group discussion of the following questions:
  - Are answers to your survey questions likely to vary when the trigger occurs? (If not, you'll get bored entering the same data again and again.)
  - Can the entire class carry out the campaign?
  - Do triggers occur so rarely that you'll have very little data? Do they occur so often that you'll get frustrated entering too much data?
  - Ethics: Would sharing these data with strangers or friends be embarrassing or undermine someone's privacy?
  - Can you change your trigger or survey questions to improve your evaluation?
  - Will you be able to gather enough relevant data from your survey questions to answer your statistical investigative questions?
- Students have collaboratively created their first Participatory Sensing campaign. Inform them that you will be demonstrating one tool used to create the campaigns that they see on their smart devices or the computer. Students should take notes in their DS journals, as they will be using the tool later.
Class Scribes:
One team of students will give a brief talk discussing what they think the three most important topics of the day were.