This is version 89.
Question # | Question Text | Analysis | Data Sources | How user might query |
1 | Have my course grades declined over the years? | ANOVA of course grades, individual exam grades | Faculty uploaded spreadsheet with records of student grades for each year | Log in as myself, check off my courses that I am interested in analyzing |
2 | Why are my course grades declining? Are more students taking AP Bio and placing out of this course? | Relationship among ACT/SAT scores, AP Bio scores, course grades | Registrar demographic data: ACT/SAT scores, AP Bio scores; course grades; Faculty uploaded spreadsheet with records of student grades for each year | Log in as myself, check off my courses that I am interested in analyzing |
3 | Has the use of clickers improved student learning in my course? | Correlation (regression) between clicker use, clicker score(s), and exam scores | Registrar demographic data; Faculty uploaded exams, tagged by topic; Faculty uploaded clicker questions, tagged by topic; Faculty uploaded spreadsheet with: Annual records of student grades, Student grades on individual assessment items, Student grades or responses on individual clicker items | Log in as myself, check off my courses that I am interested in analyzing. Within courses, I will mark assessment items of interest (clicker questions on topic a, assessment questions on topic a) |
4 | What is the impact of multiple attempts at online homework on exam scores in my course? | Correlation between number of tries and exam scores; regression/partial regression using GPA or other control variables | Registrar demographic data; Registrar GPA data; Faculty uploaded exams, tagged by topic; Faculty uploaded spreadsheet with: Student exam scores, Student grades on individual items | Log in as myself, check off my courses that I am interested in analyzing. Within each course, I will mark assessment items of interest (homework questions on topic a, assessment questions on topic a) |
5 | How are my students performing on a particular topic? Are they doing better on evolution this year (versus last year)? | Pre-test/post-test analysis; ANOVA of grades on specific assessment items | Registrar demographic data; Faculty uploaded assessments and rubric for each year, tagged by topic; Faculty uploaded spreadsheet with: Student exam scores for each year, Student grades on individual assessment items for each year | Log in as myself, check off my courses that I am interested in analyzing. Within each course, I will mark assessment items of interest |
6 | How are my students performing on questions that are of a higher Bloom's level? | Descriptive statistics on questions based on classifications – correlation; cluster analysis (PCA) of outcomes based on classification, topic, and the interaction of those two | Registrar demographic data; Faculty uploaded assessments and rubric, tagged with Bloom's data; Faculty uploaded spreadsheet with: Student exam scores for each year, Student grades on individual assessment items for each year | Log in as myself, check off my courses that I am interested in analyzing. Within each course, I will mark assessment items of interest to me, based on Bloom's levels |
7 | Does a particular teaching innovation (e.g. Avida-ED) impact student learning of a particular topic (e.g. evolution) in my course? | Correlation between innovation and exam score (or score on particular assessment items on exam) over time | Registrar demographic data; Faculty uploaded assessments and rubric for each year, tagged by topic; Faculty uploaded spreadsheet with: Student exam scores for each year, Student grades on individual assessment items for each year; Rubric for exam or assessment items; Student response data for exam or assessment items (at T0 and T1); Student scores for exam or assessment items (at T0 and T1) | Log in as myself, check off my courses that I am interested in analyzing. Within each course, I will mark assessment items of interest to me, based on keywords or tags |
8 | Is there a difference in course grades between sections? (e.g. different sections could be taught by different TAs, at different times, etc.) | Correlation between section and individual assessment (or assessment items)/course grades (or between TA and section grades) | Registrar demographic data; Faculty uploaded assessments and rubric for each year, tagged by topic; Faculty uploaded spreadsheet with: Student exam scores, Student grades on individual assessment items | Log in as myself, check off my courses that I am interested in analyzing |
9 | How do my students compare with students from another school on topic x? | Correlation between institution and grade on topic x | Registrar demographic data; Faculty uploaded assessments and rubrics, tagged by topic; Faculty uploaded spreadsheet with: Student exam scores, Student grades on individual assessment items; Metadata linking course to a particular type/size of institution | Log in as myself, check off my courses that I am interested in analyzing. Select my assessment items from topic x. Search the database for additional schools to analyze that are similar in Carnegie categorization, similar in course size, or all schools. Search these assessments for items on topic x |
10 | For accreditation, I would like to show that our students have learned y. | Comparison of student performance at T0 and T1 | Registrar demographic data; Faculty uploaded assessments and rubric, tagged with topic and Bloom's data; Faculty uploaded spreadsheet with: Student exam scores for each year, Student grades on individual assessment items for each year | Log in as myself, check off my courses that I am interested in analyzing. Select my assessment items from topic x |
11 | What levels of understanding are typically targeted by assessments in introductory (or upper) level biology courses at my institution/at institutions similar to mine/at all institutions? | Proportion of lower-level Bloom's items to higher-level Bloom's items; histogram of data | Faculty uploaded assessments, tagged by Bloom's data; Metadata form tagging course level, institution information (size, type) | Search for all assessment items in courses at the 100-level |
12 | What misconceptions do students generally hold about topic d? | Qualitative analysis of student responses and scores to parse out misconceptions | Registrar demographic data; Faculty uploaded assessments tagged with topic and Bloom's data | Search for all assessment items on topic d |
13 | What is the impact of course size on the assessments used? | Correlation between course size and Bloom's taxonomy level | Registrar demographic data; Faculty uploaded assessments tagged with topic and Bloom's data; Metadata form indicating course size | Search for courses of a particular size |
14 | Are low performers in my course also struggling in other courses? | Correlation between course grade or individual assessment grade and university GPA, major GPA | Registrar demographic data; Registrar GPA data; Faculty uploaded spreadsheet with student course grades | Log in as myself, check off my courses that I am interested in analyzing |
15 | How reliable is my exam question taxonomy? | | | |
May 18, 2007 | What areas or concepts in biology/chemistry/physics are students having difficulty with? |
May 18, 2007 | How did students do on all questions of a given topic? (within a class; within a course; within an institution; across the database)
May 18, 2007 | How did students do on a question that required a certain level of thinking? |
May 18, 2007 | I would like to find those questions on topic X where students did particularly well or poorly (depending on my own criteria of good and bad) |
May 18, 2007 | Compare performance of two (or more) groups of students on an item or set of items: over time (this year versus last year); by gender or ethnic background; majors versus non-majors
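Several of the analyses above (Questions 1, 5, and 8) reduce to comparing grade distributions across groups. As a concrete illustration, here is a minimal sketch of the year-over-year ANOVA from Question 1, assuming grades arrive as a faculty-uploaded spreadsheet with `year` and `grade` columns; the column names and toy data are illustrative, not the database schema.

```python
# Sketch: one-way ANOVA of course grades across years (Question 1).
# The "year"/"grade" columns and the data are illustrative assumptions.
import pandas as pd
from scipy import stats

# Toy stand-in for the uploaded grade records (one row per student).
grades = pd.DataFrame({
    "year":  [2005] * 5 + [2006] * 5 + [2007] * 5,
    "grade": [3.5, 3.0, 3.5, 4.0, 3.0,
              3.0, 2.5, 3.0, 3.5, 2.5,
              2.5, 2.0, 3.0, 2.5, 2.0],
})

# Group grades by year and test whether the yearly means differ.
groups = [g["grade"].values for _, g in grades.groupby("year")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A small p-value here would support the "grades have declined" hypothesis; the same grouping pattern applies to sections (Question 8) or pre/post cohorts (Question 5).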
The following use cases and database requirements come from weekly meetings of the Database Development Team.
No. | Meeting Date | Use Case Description | Decision Made? | dBase as Test Bank | User training/ documentation | Content metadata | Course Metadata | Query related | System Requirement | Must/Should/Nice to Have |
1 | June 19, 2007 KBS | I would like to consolidate and archive my data and metadata. | n/a | | | x | x | | | Must |
2 | April 13, 2007 September 18, 2007 June 19, 2007 | Tagging assessment items with Bloom's data, concept categories, learning objectives, or some other taxonomy | yes - will support | | | x | | | | Must |
3 | July 3, 2007 | Assessment items will be tagged according to an ontology | yes | x | | x | | x | | Must |
4 | November 27, 2007 | Database will use the NBII Biocomplexity thesaurus as the Biology ontology. | yes | x | | x | | | | Must |
5 | November 27, 2007 | User will be able to search the NBII thesaurus and, with a mouse click, add that concept tag to their assessment item(s). | yes? | x | | x | | | | Should |
6 | November 27, 2007 | Ontology for a discipline must not be rigid - additions must be possible. | yes? | x | | x | | | x | Must |
7 | February 4, 2008 MU-L | Must be able to use different content metadata by discipline. While we can use the NBII thesaurus for biology, we need to have an API that can be used to define other metadata for other disciplines. | yes - will support | | | x | | | x | Must |
8 | April 13, 2007 May 22, 2007 | Rubric: I would like to upload or download a rubric for this assessment item(s) | yes - will support | x | | x | | | | Must |
9 | February 4, 2008 MU-L JLM | This assessment (or assessment item) has multiple rubrics. | yes - will support | x | | x | | | | Must |
10 | February 4, 2008 MU-L JLM | This assessment has been scored in multiple ways (see above). | yes - will support | | | x | | | | Must |
11 | February 4, 2008 MU-L JLM | This assessment has multiple and different raters on two (or more) different rubrics | yes - will support | | | x | | | | Must |
12 | May 18, 2007 June 19, 2007 | I would like to find assessment items that address a particular misconception | n/a | x | | x | | | | Must |
13 | May 18, 2007 October 9, 2007 | I would like to upload my non-MC exams and student responses. Student responses are a scanned image -or- electronic. | yes - will support | | | | | | | Must |
14 | February 7, 2008 | Faculty who upload data must agree that any identifying information in student responses has been removed | yes - will support -- how? | | | | | | x | Must |
15 | July 3, 2007 November 6, 2007 | I have additional data/metadata I would like to upload. -or- I made a mistake. Where can I do that? | n/a | x | x | x | x | | | Must |
16 | April 13, 2007 May 4, 2007 | Identifying information of students, faculty and universities must be removed | yes - will support | | | | x | | | Must |
17 | April 13, 2007 | I would like to import data directly from my course management software (e.g. Angel, Blackboard, WebCT, etc.) | outside current scope | | | | | | x | Should |
18 | May 4, 2007 | I would like to import student response data directly from my Excel workbook | yes - will support | | | | | | x | Must |
19 | April 13, 2007 September 18, 2007 KBS | There must be a minimum level of data/metadata included for a submission to be accepted into the database | Must define minimum metadata standards | | | | | | x | Must |
20 | May 18, 2007 | Find questions based on format (clicker, MC, short answer, etc.) | pending? | x | | x | | x | | Nice |
21 | May 18, 2007 | Find questions that include a particular format element (i.e. graph, particular model or image, etc). | outside current scope | | | | | | | Nice |
22 | June 12, 2007 | I would like to query across data levels (i.e. query for particular assessment items at an institution over time and courses) | yes - will support | | | x | x | x | |
23 | June 19, 2007 | I would like demographic data returned. | yes - will support | | | | x | | | Must |
24 | June 19, 2007 | I would like to see what concepts are considered difficult across disciplines. Are we going to compute dynamic metadata, such as item difficulty, that would change based on student data? Difficulty and other performance metrics should be dynamic, rather than asking faculty to upload them. | pending - will we provide simple analysis? | x | | x | | x | | Nice |
25 | July 31, 2007 | I would like to upload my scientific reasoning instrument. Questions do not fit a discipline-based ontology This instrument is administered at multiple institutions. | yes | x | x | x | | x | | Must |
26 | August 30, 2007 | Each student, class, course, institution, faculty member must have a unique identifier. | yes - will support | | | | | x | x | Must |
27 | August 30, 2007 | Faculty members must be able to indicate pages on an exam that are blank, devoted to a figure, picture or formula | yes - completed? | | | | | | x | Must |
28 | August 30, 2007 | Who can access the database? -- One solution: Two levels of data: Personal data (visible only to the individual who uploaded) and Public (visible to everyone) | pending | x | x | | | | | Should |
29 | August 30, 2007 | The database must handle multiple instructors for a class -and- visiting instructors (i.e. faculty who are not officially on staff at an institution) | no | | | | x | x | | Should |
30 | September 11, 2007 | Database must deal with multiple sections of a single course. | yes | | | | x | x | | Must |
31 | October 9, 2007 | Data is returned completely anonymized/partially anonymized/not anonymized. | yes - completely anonymized | | | | x | | x | Must |
32 | October 9, 2007 | I would like to track a cohort of students over time. | yes - will support | | | | x | x | | Must |
33 | November 6, 2007 | Query returns must be constrained so as not to return a limited dataset that could result in the identification of a student. | yes - will support | | | | x | x | | Must |
34 | November 6, 2007 | I would like to find assessment items tagged as 'good' - search quality tags by a professional society. | yes - will support | x | | x | | x | | Should |
35 | November 6, 2007 | I have no idea how to classify my assessment items. | no | | x | | | | | Must - user documentation |
36 | December 4, 2007 | I would like data returned in a simple, rectangular format. | yes - will support | | | | | | x | Must |
37 | December 4, 2007 | I would like data returned in some other format (what might those be?) | pending | | | | | | x | Must? |
38 | December 4, 2007 | I would like to query all R1 institutions, based on the Carnegie classification system. | pending | | | | | x | | Should |
39 | December 4, 2007 | I would like a simple analysis of change over time returned with the data. | pending | | | | x | x | | Nice |
40 | December 4, 2007 | I would like my search history saved. Are we saving the query syntax or are we expecting to be able to reproduce the exact dataset? Can probably reproduce the dataset if the date that the data were uploaded is stored in the database and can be made part of the query. | pending | | | | | x | x | Should |
41 | December 4, 2007 | I would like the details of my query returned with the results. | pending | | | | | | x | Should |
42 | January 10, 2008 | Instead of answering a litany of questions, I will upload my course syllabus. | pending | | | | x | | x | Must |
43 | January 10, 2008 | I would like my syllabus to be anonymized when uploaded. | pending | | | | x | | | Nice |
44 | February 7, 2008 | User uploading data takes responsibility for anonymizing their syllabus. | yes - will support | | x | | | | | Must |
45 | February 4, 2008 MU-L JLM | Will the syllabus be query-able? Doesn't seem likely. Need to revisit minimum metadata standards. | pending | | | | | | | Nice |
46 | KBS | I would like to filter out low-level Bloom questions, multiple choice questions, essay questions, etc. | pending | x | | | | x | | Should |
47 | KBS | I would like to find other faculty who are similar to me in teaching approach/assessment items/Bloom's level/etc. | pending | | | x | x | x | | Should |
48 | KBS | I would like to upload a published assessment (e.g. Concept Inventory). | pending | x | x | x | | | | Must |
49 | KBS | Database must store unconventional assessments (models, pictures, diagrams, etc). | yes - will support | x | | | | | x | Must |
50 | Purdue | I am an administrator and would like to use the database. | pending | | | | | | | Nice |
51 | Purdue | Administrator: would like to compare faculty based on: -student scores -average Bloom level per exam -average number of students taught a semester -course level taught | n/a | | | x | x | x | | Nice |
52 | February 4, 2008 MU-L JLM | Administrator would like to show their program has met specific accreditation standards. | n/a | | | x | x | x | | Should |
53 | February 4, 2008 | How will assessment items be returned? Will users pick out one question at a time, or will they check off those questions they are interested in downloading? What format will questions be returned - xml, QTI, etc? | pending | x | | | x | | | Must |
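Use case 33 above requires that query returns be constrained so a small result set cannot identify a student. One simple way to implement this is a minimum-result-size rule. The sketch below assumes a threshold of 5 and an illustrative record layout; neither the threshold nor the schema is specified in the meeting notes.

```python
# Sketch of a minimum-result-size rule for query returns (use case 33):
# if a query would match fewer than MIN_RETURN records, nothing is
# returned, so small, potentially identifying datasets are never released.
# The threshold value and record layout are illustrative assumptions.
MIN_RETURN = 5

def constrained_query(records, predicate):
    """Apply predicate, but suppress result sets below the size threshold."""
    hits = [r for r in records if predicate(r)]
    return hits if len(hits) >= MIN_RETURN else []

# Toy data: section 1 has 5 students, section 2 has only 2.
records = [{"section": 1, "grade": g} for g in (3.0, 3.5, 2.5, 4.0, 3.0)]
records += [{"section": 2, "grade": g} for g in (2.0, 2.5)]

big = constrained_query(records, lambda r: r["section"] == 1)    # 5 hits: returned
small = constrained_query(records, lambda r: r["section"] == 2)  # 2 hits: suppressed
print(len(big), len(small))
```

A production system would likely also suppress complementary queries (whose difference reveals the small group), but the size floor is the core of the constraint.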
How will users query?
- Assessment item (individual question)
- Student responses
  - On individual items
  - On a group of items (i.e. an exam or user-defined number of questions)
- Concept category
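Combining the query dimensions above with use case 36 (data returned in a simple, rectangular format), a query over a user-defined group of items could pivot long-format response records into one row per student and one column per item. A minimal sketch; all column names and data are illustrative, not the system's schema:

```python
# Sketch: query student responses on a user-selected group of items and
# return them in a rectangular layout (use case 36). Illustrative only.
import pandas as pd

# Long-format response data: one row per (student, item) pair.
responses = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s2", "s3", "s3"],
    "item_id":    ["q1", "q2", "q1", "q2", "q1", "q2"],
    "score":      [1, 0, 1, 1, 0, 1],
})

# The user "checks off" the assessment items of interest.
selected_items = ["q1", "q2"]

# Filter to the selected items, then pivot to the rectangular return format.
subset = responses[responses["item_id"].isin(selected_items)]
rectangular = subset.pivot(index="student_id", columns="item_id", values="score")
print(rectangular)
```

Other return formats (use case 37, e.g. XML or QTI for the items themselves) would be serializations of the same selection.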