Use_case

Difference between version 120 and version 74:

Line 1 was replaced by line 1
- !Use Case
+ !!Use Case
At line 2 added 42 lines.
+ The following use cases and database requirements come from weekly meetings of the Database Development Team.
+
+ ||No.||Status||Meeting Date||Use Case Description||Decision Made?||dBase as Test Bank||User training/\\documentation||Content metadata||Course Metadata||Query related||System Requirement||Must/Should/Nice to have
+ |1|partial|April 13, 2007\\September 18, 2007\\June 19, 2007|[Tagging assessment items|Use Case 3]: with Bloom’s data, concept categories, learning objectives, or some other taxonomy\\tag in multiple ways\\tag with multiple raters|yes - will support| | |x| | | |Must
+ |2|partial|July 3, 2007|[Assessment items will be tagged according to a domain specific ontology|Use Case 4]\\Biology ontology: NBII Biocomplexity Thesaurus\\Ontologies must not be rigid -- must allow for additions.|yes - will support|x| |x| |x| |Must
+ |3|drop|February 4, 2007\\MU-L\\JLM|[Rubrics|Use Case 5]: database must support multiple rubrics for an assessment item\\multiple scores for an assessment item\\multiple raters of an assessment item|yes - will support|x| |x| | | |Must
+ |4|partial|April 13, 2007|[Data storage|Use Case 11]: Must store raw response data for assessment items\\point values for assessment items and assessments (i.e., entire exams)\\student grade on assessment item|yes - will support| | |x| | | |Must
+ |5|complete|December 4, 2007|[Data return format|Use Case 6]: rectangular|yes - will support| | | | | |x|Must
+ |6|incomplete|February 4, 2008|[Return of assessment items|Use Case 12]: We will use a shopping-cart model for users to choose assessment items\\Returned questions will be in QTI format; faculty can use other software to convert them into an exam|pending|x| | |x| | |Must
+ |7|incomplete|May 18, 2007\\October 9, 2007|[Data upload|Use Case 13]: non-MC exams and student responses.\\Student responses are a scanned image -or- electronic.|yes - will support| | | | | | |Must
+ |8|complete|February 7, 2008|[Security|Use Case 7]: Faculty who upload data must agree that any identifying information in student responses has been removed\\any identifying information in their syllabus that they do not wish to be published in the dBase has been removed.|yes - will support -- how?| | | | | |x|Must
+ |9|complete|April 13, 2007\\May 4, 2007|[Security|Use Case 7]: Student IDs will be hashed with a unique identifier\\Faculty and university IDs could also be hashed.\\Data are returned completely anonymized\\Queries are constrained so as not to return a limited dataset that could identify a student.|yes - will support| | | | |x|x|Must
+ |10|partial|July 3, 2007\\November 6, 2007|I have [additional data/metadata|Use Case 14] I would like to upload. Where can I do that?|yes - will support|x|x|x|x| | |Must
+ |11|verify|February 18, 2008\\MU-L\\JLM|[Editing environment|Use Case 15] for data: how can faculty fix errors in data they have uploaded?|yes - will support| | | | | |x|Must
+ |12|complete|April 13, 2007\\September 18, 2007\\KBS|[Minimum metadata requirements|Use Case 8]|yes - will support\\have been defined| | | | | |x|Must
+ |13|complete|May 4, 2007|I would like to [import student response data|Use Case 16] directly from my Excel workbook|yes - will support| | | | | |x|Must
+ |14|incomplete|June 19, 2007|[Demographic data|Use Case 9] will be available for all queries.\\Faculty uploaded demographic data must be tagged with the date they were current.|yes - will support| | | |x| | |Must
+ |15|complete|August 30, 2007|The database must handle [multiple instructors|Use Case 17] for a class.|no| | | |x|x| |Must
+ |16|complete|August 30, 2007|Faculty members must be able to indicate pages on an exam that are blank, devoted to a figure, picture or formula|yes - completed?| | | | | |x|Must
+ |17|complete|September 11, 2007|Database will treat [sections of a course as separate courses|Use Case 18].\\How can we alleviate work for a faculty member uploading the same assessment for 60 sections of a very large course?|yes| | | |x|x| |Must
+ |18|complete|October 9, 2007|[Query: track students over time.|Use Case 10]|yes - will support| | | |x|x| |Must
+ |19|drop|May 18, 2007\\June 19, 2007|Metadata query: Query for common misconceptions|yes - will support|x| |x| | | |Must
+ |20|complete|July 31, 2007|[Non-traditional assessments|Use Case 19]: Scientific reasoning instruments, concept inventories, published assessment\\Assessment items will not fit a discipline-based ontology|yes|x|x|x| |x| |Must
+ |21|partial|June 12, 2007|I would like to [query across data levels|Use Case 20] (i.e., query for particular assessment items at an institution over time and courses)|yes - will support| | |x|x|x| |Must
+ |22|partial|August 30, 2007|There will be two levels of data access: Private (visible only to the individual who uploaded the data) and Public (visible to all registered users)|yes - will support|x|x| | | | |Should
+ |23|-|November 6, 2007|Multiple taxonomies: the database will support multiple taxonomies. All taxonomies will be viewable by all registered users; however, some taxonomies can only be implemented by specific individuals (e.g., professional society tags of 'good' assessment items)|yes - will support|x| |x| |x| |Should
+ |24|-|December 4, 2007|Query: Search the database for a particular type of institution, based on the Carnegie classification system.|pending| | | | |x| |Should
+ |25|-|December 4, 2007|Save search history syntax in order to repeat search at some point in the future.\\Alternatively, return search syntax with query results.|pending| | | | |x|x|Should
+ |26|-|KBS|I would like to filter out low-level Bloom questions, multiple choice questions, essay questions, etc.|pending| x| | | |x| |Should
+ |27|-|KBS|Fostering collaboration: Faculty contact information will be available for all faculty who opt to provide such information.|yes - will support| | |x|x|x| |Should
+ |28|-|KBS|Unconventional assessment items: database will store the item\\Student response data (Excel, Word files, etc.) will also be stored.|yes - will support|x| | | | |x|Nice
+ |29|-|November 6, 2007|I have no idea how to classify my assessment items.|no| |x| | | | |Nice
+ |30|-|May 18, 2007|Query: Find questions based on format (clicker, mc, short answer, etc)\\Find questions that include a particular format element (graph, image, model)|pending -- can this information be gleaned from the parsing software?|x| |x| |x| |Nice
+ |31|-|June 19, 2007|Query: What concepts are considered difficult across disciplines. Will the database compute dynamic metadata such as the difficulty of an item? Such data would change based on the data from the students. Difficulty and other performance metrics should be dynamic, rather than asking faculty to upload them.|pending - will we provide simple analysis?|x| |x| |x| |Nice
+ |32|-|December 4, 2007|Data analysis: simple analysis of change over time returned with the data.|pending| | | |x|x| |Nice
+ |33|-|January 10, 2008|Syllabus: Will not be query-able and will only be anonymous if the faculty uploading removes identifying information.|yes - will support| | | |x| | |Nice
+ |34|-|Purdue|Who can access data across courses/faculty members? Administrator use case: demonstrating compliance with accreditation standards.|pending| | | | | | |Nice
+ |35|-|November 27, 2007|User will be able to search the NBII thesaurus and, with a mouse click, add that concept tag to their assessment item(s).|yes?|x| |x| | | |Nice
+ |36|-|April 13, 2007|I would like to import data directly from my course management software (e.g. Angel, Blackboard, WebCT, etc)|outside current scope| | | | | |x|Nice
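Use case 9 above requires that student (and possibly faculty and university) IDs be hashed with a unique identifier so that returned data are anonymized while a given ID still maps to the same token, which is what makes tracking a cohort over time (use case 18) possible. The sketch below is one hypothetical way to implement that requirement, not the team's actual scheme; `SECRET_SALT` and `anonymize_id` are illustrative names, and the salt is assumed to be a server-side secret kept separate from the data.

```python
# Hypothetical sketch of the ID-hashing requirement in use case 9.
# A keyed (salted) one-way hash makes tokens stable per ID but
# infeasible to reverse without the server-side secret.
import hashlib
import hmac

# Assumption: a server-side secret, never stored alongside the data.
SECRET_SALT = b"replace-with-a-server-side-secret"

def anonymize_id(raw_id: str) -> str:
    """Return a stable, irreversible token for a student/faculty/university ID."""
    return hmac.new(SECRET_SALT, raw_id.encode("utf-8"), hashlib.sha256).hexdigest()

# The same ID always yields the same token; distinct IDs yield distinct tokens
# except with negligible probability.
```

Because the hash is keyed rather than a bare SHA-256 of the ID, an attacker who obtains the database cannot recover student numbers by brute-forcing the (small) ID space without also obtaining the secret.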
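Use case 5 specifies a "rectangular" data return format. One minimal interpretation, sketched below, is pivoting long-form response records (one row per student-item pair) into a rectangle with one row per student and one column per assessment item; the field names and the `to_rectangular` helper are assumptions for illustration, not the database's actual schema.

```python
# Hypothetical sketch of the rectangular return format of use case 5:
# pivot long-form (student, item, score) records into one row per
# student with one column per item. Missing responses become None.
def to_rectangular(records):
    """records: iterable of (student_token, item_id, score) tuples.
    Returns (header, rows): one row per student, one column per item."""
    items = sorted({item for _, item, _ in records})
    by_student = {}
    for student, item, score in records:
        by_student.setdefault(student, {})[item] = score
    header = ["student"] + items
    rows = [[student] + [cells.get(item) for item in items]
            for student, cells in sorted(by_student.items())]
    return header, rows

records = [("s1", "Q2", 3), ("s1", "Q1", 5), ("s2", "Q1", 4)]
header, rows = to_rectangular(records)
# header: ['student', 'Q1', 'Q2']; rows: [['s1', 5, 3], ['s2', 4, None]]
```

A rectangle like this drops straight into a spreadsheet or a statistics package, which fits the faculty workflows described in the research questions below.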
+
+ ----
+ !Research Questions
Lines 7-17 were replaced by lines 49-65
- |4| What is the impact of multiple attempts at online homework on exam scores in my course?|Correlation between number of tries and exam scores\\Regression/partial regression using GPA or other control variables|Registrar demographic data\\Registrar GPA data\\Faculty uploaded exams, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores\\Student grades on individual items|Log-in as myself, check-off my courses that I am interested in analyzing\\With-in each course, I will mark assessment items of interest (homework questions on topic a, assessment questions on topic a)|Demographic data\\Exam questions on topic\\Homework questions on topic\\# attempts per homework problem per student\\Exam scores\\Individual student responses on exam, at assessment item level
- |5| How are my students performing on a particular topic?\\Are they doing better on evolution this year (versus last year)?|Pre-test/post-test analysis\\ANOVA of grades on specific assessment items|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores for each year, Student grades on individual assessment items for each year|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest|Demographic data\\Exam questions on topic\\Exam rubric\\Student exam responses/scores on topic\\Question order from exam (since easier questions often to come first)
- |6|How are my students performing on questions that are of a higher Bloom’s level?| Descriptive statistics on questions based on classifications – correlation\\Cluster analysis (PCA) of outcomes based on classification, topic, and the interaction of those two|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on Bloom’s levels|Demographic data\\Exam questions\\Bloom’s level for each assessment item\\Student response data on each assessment item\\Student scores on each assessment item\\Rubric for each assessment item
- |7| Does a particular teaching innovation (i.e. Avida-ED) impact student learning of a particular topic (i.e. Evolution) in my course?|Correlation between innovation and exam score (or score on particular assessment items on exam) over time|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on keywords or tags\Demographic data\\Rubric for exam or assessment items\\Student response data for exam or assessment items (at T0 and T1)\\Student scores for exam or assessment items (at T0 and T1)
- |8|Is there a difference in course grades between sections? (e.g. Different sections could be taught by different TAs, at different times, etc)|Correlation between section and individual assessment (or assessment items)/course grades (or between TA and section grades)|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores; Student grades on individual assessment items|Log-in as myself, check-off my courses that I am interested in analyzing|Demographic data\\Exam scores from multiple sections\\Individual assessment items from multiple sections\\Student scores on individual assessment items from multiple sections
- |9|How do my students compare with students from another school on topic x?|Correlation between institution and grade on topic x.|Registrar demographic data\\Faculty uploaded assessments and rubrics, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores; Student grades on individual assessment items\\Metadata linking course to a particular type/sized institution|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x\\Search database for additional school to analyze that are similar in Carnegie categorization, similar in course size, all schools\\Search these assessments for items on topic x|Demographic data\\Individual assessment items on topic x.\\Exam on topic x.\\Student scores on individual assessment items and/or exam on topic x.
- |10| For accreditation, I would like to show that our students have learned y.|Comparison of student performance at T0 and T1.|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with topic and Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x|Demographic data\\Assessments or assessment items on y.\\ Rubric for assessments or assessment items on y.\\ Student response data for assessments or assessment items.\\Student scores for assessments or assessment items on y.
- |11| What levels of understanding are typically targeted by assessments in introductory (or upper) level biology courses at my institution/at institutions similar to mine/at all institutions?|Proportion of lower-level Bloom’s items to higher-level Bloom’s items\\Histogram of data| Faculty uploaded assessments, tagged by Bloom’s data\\Metadata form tagging course level, institution information (size, type)|Search for all assessment items in courses at the 100-level|Bloom’s data for individual assessment items in introductory courses at a particular institution(s)/across institutions
- |12| What misconceptions do students generally hold about topic d?|Qualitative analysis of student responses and scores to parse out misconceptions| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data| Search for all assessment items on topic d|Demographic data\\Assessment items on topic d (if multiple choice, must include the foils)\\Rubric for assessment items on topic d\\Student responses\\Student scores
- |13| What is the impact of course size on the assessments used?|Correlation between course size and Bloom’s taxonomy level| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data\\Metadata form indicating course size|Search for courses of a particular size|Demographic data\\Course size\\Institution size or Carnegie Classification\\Majors or non-majors course\\Bloom’s taxonomy level
- |14| Are low performers in my course also struggling in other courses?|Correlation between course grade or individual assessment grade and university GPA, major GPA| Registrar demographic data\\Registrar GPA data\\Faculty uploaded spreadsheet with student course grades| Log-in as myself, check off my courses that I am interested in analyzing| Demographic data\\Course or assessment grades by students\\University GPA\\Major GPA
+ |4|What is the impact of multiple attempts at online homework on exam scores in my course?|Correlation between number of tries and exam scores\\Regression/partial regression using GPA or other control variables|Registrar demographic data\\Registrar GPA data\\Faculty uploaded exams, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores\\Student grades on individual items|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest (homework questions on topic a, assessment questions on topic a)
+ |5|How are my students performing on a particular topic?\\Are they doing better on evolution this year (versus last year)?|Pre-test/post-test analysis\\ANOVA of grades on specific assessment items|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores for each year, Student grades on individual assessment items for each year|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest
+ |6|How are my students performing on questions that are of a higher Bloom’s level?| Descriptive statistics on questions based on classifications – correlation\\Cluster analysis (PCA) of outcomes based on classification, topic, and the interaction of those two|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on Bloom’s levels
+ |7|Does a particular teaching innovation (e.g., Avida-ED) impact student learning of a particular topic (e.g., Evolution) in my course?|Correlation between innovation and exam score (or score on particular assessment items on exam) over time|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on keywords or tags
+ |8|Is there a difference in course grades between sections? (e.g. Different sections could be taught by different TAs, at different times, etc)|Correlation between section and individual assessment (or assessment items)/course grades (or between TA and section grades)|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores; Student grades on individual assessment items|Log-in as myself, check-off my courses that I am interested in analyzing
+ |9|How do my students compare with students from another school on topic x?|Correlation between institution and grade on topic x.|Registrar demographic data\\Faculty uploaded assessments and rubrics, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores; Student grades on individual assessment items\\Metadata linking course to a particular type/sized institution|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x\\Search database for additional school to analyze that are similar in Carnegie categorization, similar in course size, all schools\\Search these assessments for items on topic x
+ |10|For accreditation, I would like to show that our students have learned y.|Comparison of student performance at T0 and T1.|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with topic and Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x
+ |11|What levels of understanding are typically targeted by assessments in introductory (or upper) level biology courses at my institution/at institutions similar to mine/at all institutions?|Proportion of lower-level Bloom’s items to higher-level Bloom’s items\\Histogram of data| Faculty uploaded assessments, tagged by Bloom’s data\\Metadata form tagging course level, institution information (size, type)|Search for all assessment items in courses at the 100-level
+ |12|What misconceptions do students generally hold about topic d?|Qualitative analysis of student responses and scores to parse out misconceptions| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data| Search for all assessment items on topic d
+ |13|What is the impact of course size on the assessments used?|Correlation between course size and Bloom’s taxonomy level| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data\\Metadata form indicating course size|Search for courses of a particular size
+ |14|Are low performers in my course also struggling in other courses?|Correlation between course grade or individual assessment grade and university GPA, major GPA| Registrar demographic data\\Registrar GPA data\\Faculty uploaded spreadsheet with student course grades| Log-in as myself, check off my courses that I am interested in analyzing
+ |15|How reliable is my exam question taxonomy?|
+ |May 18, 2007|What areas or concepts in biology/chemistry/physics are students having difficulty with?
+ |May 18, 2007|How did students do on all questions of a given topic?\\-within a class\\-within a course\\-within an institution\\-across the database
+ |May 18, 2007|How did students do on a question that required a certain level of thinking?
+ |May 18, 2007|I would like to find those questions on topic X where students did particularly well or poorly (depending on my own criteria of good and bad)
+ |May 18, 2007|Compare performance of two (or more) groups of students on an item or set of items\\-over time (this year versus last year)\\-gender or ethnic background\\-majors or non-majors
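Several of the questions above, and use case 31 in particular, ask whether the database should compute dynamic performance metadata such as item difficulty from stored student responses rather than asking faculty to upload it. The sketch below shows the classical test-theory version of that metric (difficulty = proportion of students answering correctly, so higher means easier); the `item_difficulty` function and its input shape are illustrative assumptions, not the team's design.

```python
# Hypothetical sketch of the "dynamic difficulty" metric of use case 31:
# classical item difficulty, recomputed on demand from response data so
# it stays current as new student responses are uploaded.
from collections import defaultdict

def item_difficulty(responses):
    """responses: iterable of (item_id, is_correct) pairs.
    Returns {item_id: proportion correct}; higher values mean easier items."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for item_id, is_correct in responses:
        total[item_id] += 1
        correct[item_id] += int(is_correct)
    return {item: correct[item] / total[item] for item in total}
```

Computing the metric at query time, instead of storing faculty-supplied difficulty ratings, keeps it consistent across courses and automatically reflects every newly uploaded response.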
Removed lines 20-87
- |?|How reliable is my exam question taxonomy?|
-
- ----
- The following use cases and database requirements come from weekly meetings of the Database Development Team.
- ||Meeting Date||Use Case Description||Decision Made?||dBase as Test Bank||User training/\\documentation||Content metadata||Course Metadata||Query related||System Requirement||Must/Should/Nice to Have
- |June 19, 2007\\KBS|I would like to consolidate and archive my data and metadata.|n/a| | |x|x|
- |April 13, 2007\\September 18, 2007|I would like to tag these assessments with my own Blooms data|yes - will support| | |x| |
- |April 13, 2007\\September 18, 2007|I would like to code these assessments according to my own taxonomy|yes - will support| | |x| |
- |May 18, 2007\\September 18, 2007|I would like to add concept categories to this assessment|yes - will support| | |X| | | |
- |April 13, 2007|I would like to know the answers and point values for this assessment item (or items)|yes - will support|x| |x| | |
- |May 22, 2007|I would like to upload my detailed rubric for this assessment.|yes - will support|x| |x| | |
- |April 13, 2007\\June 19, 2007|I would like to know the grading rubric for this assessment item (or items)|n/a|x| |x| |
- |May 18, 2007\\June 19, 2007|I would like to find assessment items that address a particular misconception| |x| |x| | |
- |June 19, 2007|What is the learning objective associated with this assessment item?|pending|x| |x|x| |
- |May 18, 2007\\October 9, 2007|I would like to upload my non-MC exams and student responses.\\Student responses are a scanned image -or- electronic.|outside the current scope| | | | |
- |April 13, 2007\\May 4, 2007|Identifying information of students, faculty and universities must be removed|yes - will support| | | |x| |
- |April 13, 2007|I would like to import data directly from my course management software (e.g. Angel, Blackboard, WebCT, etc)|outside current scope| | | | | |x
- |May 4, 2007|I would like to import student response data directly from my excel workbook|yes - will support| | | | | |x
- |April 13, 2007\\September 18, 2007|There must be a minimum level of data/metadata included for a submission to be accepted into the database|need to define minimum metadata standards| | | | | |x
- |May 18, 2007|What areas or concepts in biology/chemistry/physics are students having difficulty with?|n/a| | |x| |x|
- |May 18, 2007|How did students do on all questions of a given topic?\\-within a class\\-within a course\\-within an institution\\-across the database|n/a| | |x|x|x
- |May 18, 2007|How did students do on a question that required a certain level of thinking?|n/a| | |x| |x
- |May 18, 2007|I would like to find those questions on topic X where students did particularly well or poorly (depending on my own criteria of good and bad)|n/a| | |x| |x|
- |May 18, 2007|Compare performance of two (or more) groups of students on an item or set of items\\-over time (this year versus last year)\\-gender or ethnic background\\-majors or non-majors|yes - will support| | |x|x|x|
- |May 18, 2007|Find questions based on format (clicker, mc, short answer, etc)|pending?|x| |x| |x|
- |May 18, 2007|Find questions that include a particular format element (i.e. graph, particular model or image, etc).|outside current scope| | | | |
- |June 12, 2007|I would like to query across data levels (i.e. query for particular assessment items at an institution over time and courses|yes - will support| | |x|x|x|
- |June 19, 2007|I would like demographic data returned.|yes - will support| | | |x| |
- |June 19, 2007|I would like to see what concepts are considered difficult across disciplines. Are we going to compute dynamic meta data such as the difficulty of an item that would change based on the data from the students? Difficulty and other performance metrics should be dynamic, rather than asking faculty to upload them.|pending - will we provide simple analysis?|x| |x| |x|
- |July 3, 2007\\November 6, 2007|I have additional data/metadata I would like to upload. -or- I made a mistake. Where can I do that?|n/a|x|x|x|x|
- |July 3, 2007|Assessment items will be tagged according to an ontology, not by categories.|yes|x| |x| |x|
- |July 31, 2007|I would like to upload my scientific reasoning instrument.\\Questions do not fit a discipline-based ontology\\This instrument is administered at multiple institutions.|yes|x|x|x| |x
- |August 30, 2007|Each student, class, course, institution, faculty member must have a unique identifier.|yes - will support| | | | |x|x
- |August 30, 2007|Faculty members must be able to indicate pages on an exam that are blank, devoted to a figure, picture or formula|yes - completed?| | | | | |x
- |August 30, 2007|Who can access the database? (registered users, anyone? What level of data is returned?)|pending|x|x| | |
- |August 30, 2007|The database must handle multiple instructors for a class -and- visiting instructors (i.e. faculty who are not officially on staff at an institution)|no| | | |x|x|
- |September 11, 2007|Database must deal with multiple sections of a single course.|yes| | | |x|x|
- |October 9, 2007|Data is returned completely anonymized/partially anonymized/not anonymized.|pending| | | |x| |x
- |October 9, 2007|I would like to track a cohort of students over time.|yes - will support| | | |x|x|
- |November 6, 2007|Demographic data will be stored with individual assessments. Students will not have global identifiers.|pending - in contrast to immediate question above| | | |x|x|
- |November 6, 2007|Query returns must be constrained so as not to return a limited dataset that could result in the identification of a student.|yes - will support| | | |x|x|
- |November 6, 2007|I would like to find assessment items tagged as 'good' - search quality tags by a professional society.|yes - will support|x| |x| |x|
- |November 6, 2007|I have no idea how to classify my assessment items.|no| |x|
- |November 27, 2007|User will be able to search nbii thesaurus and with a mouse click, add that concept tag to their assessment item(s).|yes?|x| |x| |
- |November 27, 2007|Database will use the NBII Biocomplexity thesaurus as the Biology ontology.|yes|x| |x| |
- |November 27, 2007|Ontology for a discipline must not be rigid - additions must be possible.|yes?|x| |x| | |x
- |December 4, 2007|I would like data returned in a simple, rectangular format.|pending| | | | | |x
- |December 4, 2007|I would like data returned in some other format (what might those be?)|pending| | | | | |x
- |December 4, 2007|I would like to query all R1 institutions, based on the Carnegie classification system.|pending| | | | |x|
- |December 4, 2007|I would like a simple analysis of change over time returned with the data.|pending| | | |x|x|
- |December 4, 2007|I would like my search history saved. Are we saving the query syntax or are we expecting to be able to reproduce the exact dataset? Can probably reproduce the dataset if the date that the data were uploaded is stored in the database and can be made part of the query.|pending| | | | |x|x
- |December 4, 2007|I would like the details of my query returned with the results.|pending| | | | | |x
- |January 10, 2008|Instead of answering a litany of questions, I will upload my course syllabus.|pending| | | |x| |x
- |January 10, 2008|I would like my syllabus to be anonymized when uploaded.|pending| | | |x| |
- |February 4, 2008\\MU-L\\JLM|Will the syllabus be query-able? Doesn't seem likely. Need to revisit minimum metadata standards.|pending|
- |KBS|I would like to filter out low-level Bloom questions, multiple choice questions, essay questions, etc.|pending| x| | | |x|
- |KBS|I would like to find other faculty who are similar to me in teaching approach/assessment items/Bloom's level/etc.|pending| | |x|x|x|
- |KBS|I would like to upload a published assessment (e.g. Concept Inventory).|pending|x|x|x| |
- |KBS|Database must store unconventional assessments (models, pictures, diagrams, etc).|yes - will support|x| | | | |x
- |KBS|All data or a standard subset of data and metadata should be reported for each assessment item.|minimum metadata - pending| | | | |x|x
- |Purdue|I am an administrator and would like to use the database.|pending|
- |Purdue|Administrator: would like to compare faculty based on:\\-student scores\\-average Bloom level per exam\\-average number of students taught a semester\\-course level taught|n/a| | |x|x|x|
- |February 4, 2008\\MU-L\\JLM|Administrator would like to show their program has met specific accreditation standards.|n/a| | |x|x|x|
- |February 4, 2008\\MU-L|Must be able to use different content metadata by discipline. While we can use the NBII thesaurus for biology, we need to have an API that can be used to define other metadata for other disciplines.|yes - will support| | |x| | |x
- |February 4, 2008|How will assessment items be returned? Will users pick out one question at a time, or will they check off those questions they are interested in downloading? What format will questions be returned - xml, QTI, etc?|pending|x| | |x|
- ||February 4, 2008\\MU-L\\JLM|This assessment has multiple rubrics (i.e., grading rubric, coding rubric for misconceptions, coding rubric for models used, etc).|yes - will support| | |x|x| | |Must have
- |February 4, 2008\\MU-L\\JLM|This assessment has been scored in multiple ways (see above).|yes - will support| | |x| | | |
- |February 4, 2008\\MU-L\\JLM|This assessment has multiple and different raters on two (or more) different rubrics|yes - will support| | |x| | | |Must have
