
Use_case

Difference between version 120 and version 12:

Line 1 was replaced by line 1
- !Use Case
+ !!Use Case
Lines 3-6 were replaced by lines 3-41
- ||Question #||Question Text||Analysis||Data Sources||How user might query
- |1|Have my course grades declined over the years?|ANOVA of course grades, individual exam grades|Faculty uploaded spreadsheet with records of student grades for each year|Log-in as myself, check-off my courses that I am interested in analyzing
- |2|Why are my course grades declining? Are more students taking AP (bio) and placing out of this course?|Relationship among ACT/SAT scores, AP bio scores, course grades|Registrar demographic data: ACT/SAT scores, AP Bio scores; course grades; Faculty uploaded spreadsheet with records of student grades for each year|Log-in as myself, check-off my courses that I am interested in analyzing
- |3|Has the use of clickers improved student learning in my course?|Correlation (regression) between clicker use, clicker score(s) and exam scores|Registrar demographic data; Faculty uploaded exams, tagged by topic; Faculty uploaded clicker questions, tagged by topic; Faculty uploaded spreadsheet with: Annual records of student grades, Student grades on individual assessment items, Student grades or responses on individual clicker items
+ The following use cases and database requirements come from weekly meetings of the Database Development Team.
+
+ ||No.||Status||Meeting Date||Use Case Description||Decision Made?||Database as Test Bank||User Training/\\Documentation||Content Metadata||Course Metadata||Query Related||System Requirement||Must/Should/Nice to have
+ |1|partial|April 13, 2007\\September 18, 2007\\June 19, 2007|[Tagging assessment items|Use Case 3]: with Bloom's data, concept categories, learning objectives, or some other taxonomy\\tag in multiple ways\\tag with multiple raters|yes - will support| | |x| | | |Must
+ |2|partial|July 3, 2007|[Assessment items will be tagged according to a domain specific ontology|Use Case 4]\\Biology ontology: NBII Biocomplexity Thesaurus\\Ontologies must not be rigid -- must allow for additions.|yes - will support|x| |x| |x| |Must
+ |3|drop|February 4, 2007\\MU-L\\JLM|[Rubrics|Use Case 5]: database must support multiple rubrics for an assessment item\\multiple scores for an assessment item\\multiple raters of an assessment item|yes - will support|x| |x| | | |Must
+ |4|partial|April 13, 2007|[Data storage|Use Case 11]: Must store raw response data for assessment items\\point values for assessment items and assessments (i.e., entire exams)\\student grade on assessment item|yes - will support| | |x| | | |Must
+ |5|complete|December 4, 2007|[Data return format|Use Case 6]: rectangular|yes - will support| | | | | |x|Must
+ |6|incomplete|February 4, 2008|[Return of assessment items|Use Case 12]: We will use a shopping cart model for users to choose assessment items\\Returned questions will be in QTI format; faculty can use other software to translate to exam|pending|x| | |x| | |Must
+ |7|incomplete|May 18, 2007\\October 9, 2007|[Data upload|Use Case 13]: non-MC exams and student responses.\\Student responses may be either scanned images or electronic files.|yes - will support| | | | | | |Must
+ |8|complete|February 7, 2008|[Security|Use Case 7]: Faculty who upload data must agree that any identifying information in student responses has been removed\\any identifying information in their syllabus that they do not wish to be published in the dBase has been removed.|yes - will support -- how?| | | | | |x|Must
+ |9|complete|April 13, 2007\\May 4, 2007|[Security|Use Case 7]: Student ids will be hashed with a unique identifier (see the sketch following this table)\\Faculty and university ids could also be hashed.\\Data are returned fully anonymized.\\Queries are constrained so as not to return a dataset so limited that it could identify a student.|yes - will support| | | | |x|x|Must
+ |10|partial|July 3, 2007\\November 6, 2007|I have [additional data/metadata|Use Case 14] I would like to upload. Where can I do that?|yes - will support|x|x|x|x| | |Must
+ |11|verify|February 18, 2008\\MU-L\\JLM|[Editing environment|Use Case 15] for data: how can faculty fix errors in data they have uploaded?|yes - will support| | | | | |x|Must
+ |12|complete|April 13, 2007\\September 18, 2007\\KBS|[Minimum metadata requirements|Use Case 8]|yes - will support\\have been defined| | | | | |x|Must
+ |13|complete|May 4, 2007|I would like to [import student response data|Use Case 16] directly from my Excel workbook|yes - will support| | | | | |x|Must
+ |14|incomplete|June 19, 2007|[Demographic data|Use Case 9] will be available for all queries.\\Faculty uploaded demographic data must be tagged with the date they were current.|yes - will support| | | |x| | |Must
+ |15|complete|August 30, 2007|The database must handle [multiple instructors|Use Case 17] for a class.|no| | | |x|x| |Must
+ |16|complete|August 30, 2007|Faculty members must be able to indicate pages on an exam that are blank, devoted to a figure, picture or formula|yes - completed?| | | | | |x|Must
+ |17|complete|September 11, 2007|Database will treat [sections of a course as separate courses|Use Case 18].\\How can we alleviate work for a faculty member uploading the same assessment for 60 sections of a very large course?|yes| | | |x|x| |Must
+ |18|complete|October 9, 2007|[Query: track students over time.|Use Case 10]|yes - will support| | | |x|x| |Must
+ |19|drop|May 18, 2007\\June 19, 2007|Metadata query: Query for common misconceptions|yes - will support|x| |x| | | |Must
+ |20|complete|July 31, 2007|[Non-traditional assessments|Use Case 19]: Scientific reasoning instruments, concept inventories, published assessments\\Assessment items will not fit a discipline-based ontology|yes|x|x|x| |x| |Must
+ |21|partial|June 12, 2007|I would like to [query across data levels|Use Case 20] (i.e., query for particular assessment items at an institution over time and across courses)|yes - will support| | |x|x|x| |Must
+ |22|partial|August 30, 2007|There will be two levels of data access: Private (visible only to the individual who uploaded the data) and Public (visible to all registered users)|yes - will support|x|x| | | | |Should
+ |23|-|November 6, 2007|Multiple taxonomies: the database will support multiple taxonomies. All taxonomies will be viewable by all registered users; however, some taxonomies can only be implemented by specific individuals (e.g., professional society tags of 'good' assessment items)|yes - will support|x| |x| |x| |Should
+ |24|-|December 4, 2007|Query: Search database on a particular type of institution, based on the Carnegie classification system.|pending| | | | |x| |Should
+ |25|-|December 4, 2007|Save search history syntax in order to repeat search at some point in the future.\\Alternatively, return search syntax with query results.|pending| | | | |x|x|Should
+ |26|-|KBS|I would like to filter out low-level Bloom's questions, multiple choice questions, essay questions, etc.|pending|x| | | |x| |Should
+ |27|-|KBS|Fostering collaboration: Faculty contact information will be available for all faculty who opt to provide such information.|yes - will support| | |x|x|x| |Should
+ |28|-|KBS|Unconventional assessment items: database will store the item\\Student response data (Excel, Word files, etc.) will also be stored.|yes - will support|x| | | | |x|Nice
+ |29|-|November 6, 2007|I have no idea how to classify my assessment items.|no| |x| | | | |Nice
+ |30|-|May 18, 2007|Query: Find questions based on format (clicker, MC, short answer, etc.)\\Find questions that include a particular format element (graph, image, model)|pending -- can this information be gleaned from the parsing software?|x| |x| |x| |Nice
+ |31|-|June 19, 2007|Query: What concepts are considered difficult across disciplines. Will the database compute dynamic metadata such as the difficulty of an item? Such data would change based on the data from the students. Difficulty and other performance metrics should be dynamic, rather than asking faculty to upload them.|pending - will we provide simple analysis?|x| |x| |x| |Nice
+ |32|-|December 4, 2007|Data analysis: simple analysis of change over time returned with the data.|pending| | | |x|x| |Nice
+ |33|-|January 10, 2008|Syllabus: Will not be query-able and will only be anonymous if the faculty uploading removes identifying information.|yes - will support| | | |x| | |Nice
+ |34|-|Purdue|Who can access data across courses/faculty members? Administrator use to demonstrate compliance with accreditation standards.|pending| | | | | | |Nice
+ |35|-|November 27, 2007|User will be able to search the NBII thesaurus and, with a mouse click, add that concept tag to their assessment item(s).|yes?|x| |x| | | |Nice
+ |36|-|April 13, 2007|I would like to import data directly from my course management software (e.g. Angel, Blackboard, WebCT, etc)|outside current scope| | | | | |x|Nice
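+ A minimal sketch of how requirement 9 (hashed student ids) might be implemented, assuming Python and a per-deployment secret salt kept outside the database; the function name and salt handling here are illustrative assumptions, not part of the specification.
+ {{{
+ import hashlib
+ import hmac
+
+ # Illustrative secret salt; a real deployment would manage this key
+ # securely and never store it alongside the response data.
+ SECRET_SALT = b"replace-with-a-long-random-secret"
+
+ def anonymize_id(student_id: str) -> str:
+     """Return a stable, one-way pseudonym for a student id."""
+     # HMAC-SHA256 keeps the mapping consistent across uploads (so a
+     # student can be tracked over time, requirement 18) while making
+     # recovery of the original id impractical.
+     return hmac.new(SECRET_SALT, student_id.encode("utf-8"),
+                     hashlib.sha256).hexdigest()
+
+ # The same id always maps to the same pseudonym.
+ print(anonymize_id("A1234567"))
+ }}}
+ The second clause of requirement 9 (constraining queries) would be handled separately, for example by refusing to return any result set smaller than a minimum record count.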
At line 7 added 23 lines.
+ ----
+ !Research Questions
+ ||Question #||Question Text||Analysis||Data Sources||How user might query
+ |1|[Have my course grades declined over the years?|UseCase1]|ANOVA of course grades, individual exam grades (see the sketch following this table)|Faculty uploaded spreadsheet with records of student grades for each year|Log-in as myself, check-off my courses that I am interested in analyzing
+ |2|[Why are my course grades declining? Are more students taking AP (bio) and placing out of this course?|UseCase2]|Relationship among ACT/SAT scores, AP bio scores, course grades|Registrar demographic data: ACT/SAT scores, AP Bio scores; course grades; Faculty uploaded spreadsheet with records of student grades for each year|Log-in as myself, check-off my courses that I am interested in analyzing
+ |3|Has the use of clickers improved student learning in my course?|Correlation (regression) between clicker use, clicker score(s) and exam scores|Registrar demographic data; Faculty uploaded exams, tagged by topic; Faculty uploaded clicker questions, tagged by topic; Faculty uploaded spreadsheet with: Annual records of student grades, Student grades on individual assessment items, Student grades or responses on individual clicker items|Log-in as myself, check-off my courses that I am interested in analyzing. Within courses, I will mark assessment items of interest (clicker questions on topic a, assessment questions on topic a)
+ |4|What is the impact of multiple attempts at online homework on exam scores in my course?|Correlation between number of tries and exam scores\\Regression/partial regression using GPA or other control variables|Registrar demographic data\\Registrar GPA data\\Faculty uploaded exams, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores\\Student grades on individual items|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest (homework questions on topic a, assessment questions on topic a)
+ |5|How are my students performing on a particular topic?\\Are they doing better on evolution this year (versus last year)?|Pre-test/post-test analysis\\ANOVA of grades on specific assessment items|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores for each year, Student grades on individual assessment items for each year|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest
+ |6|How are my students performing on questions that are of a higher Bloom’s level?| Descriptive statistics on questions based on classifications – correlation\\Cluster analysis (PCA) of outcomes based on classification, topic, and the interaction of those two|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on Bloom’s levels
+ |7|Does a particular teaching innovation (e.g. Avida-ED) impact student learning of a particular topic (e.g. evolution) in my course?|Correlation between innovation and exam score (or score on particular assessment items on exam) over time|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on keywords or tags\\Demographic data\\Rubric for exam or assessment items\\Student response data for exam or assessment items (at T0 and T1)\\Student scores for exam or assessment items (at T0 and T1)
+ |8|Is there a difference in course grades between sections? (e.g. Different sections could be taught by different TAs, at different times, etc)|Correlation between section and individual assessment (or assessment items)/course grades (or between TA and section grades)|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores; Student grades on individual assessment items|Log-in as myself, check-off my courses that I am interested in analyzing
+ |9|How do my students compare with students from another school on topic x?|Correlation between institution and grade on topic x.|Registrar demographic data\\Faculty uploaded assessments and rubrics, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores; Student grades on individual assessment items\\Metadata linking course to a particular type/sized institution|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x\\Search database for additional schools to analyze that are similar in Carnegie classification, similar in course size, or all schools\\Search these assessments for items on topic x
+ |10|For accreditation, I would like to show that our students have learned y.|Comparison of student performance at T0 and T1.|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with topic and Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x
+ |11|What levels of understanding are typically targeted by assessments in introductory (or upper) level biology courses at my institution/at institutions similar to mine/at all institutions?|Proportion of lower-level Bloom’s items to higher-level Bloom’s items\\Histogram of data| Faculty uploaded assessments, tagged by Bloom’s data\\Metadata form tagging course level, institution information (size, type)|Search for all assessment items in courses at the 100-level
+ |12|What misconceptions do students generally hold about topic d?|Qualitative analysis of student responses and scores to parse out misconceptions| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data| Search for all assessment items on topic d
+ |13|What is the impact of course size on the assessments used?|Correlation between course size and Bloom’s taxonomy level| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data\\Metadata form indicating course size|Search for courses of a particular size
+ |14|Are low performers in my course also struggling in other courses?|Correlation between course grade or individual assessment grade and university GPA, major GPA| Registrar demographic data\\Registrar GPA data\\Faculty uploaded spreadsheet with student course grades| Log-in as myself, check off my courses that I am interested in analyzing
+ |15|How reliable is my exam question taxonomy?|
+ |May 18, 2007|What areas or concepts in biology/chemistry/physics are students having difficulty with?
+ |May 18, 2007|How did students do on all questions of a given topic?\\-within a class\\-within a course\\-within an institution\\-across the database
+ |May 18, 2007|How did students do on a question that required a certain level of thinking?
+ |May 18, 2007|I would like to find those questions on topic X where students did particularly well or poorly (depending on my own criteria of good and bad)
+ |May 18, 2007|Compare performance of two (or more) groups of students on an item or set of items\\-over time (this year versus last year)\\-gender or ethnic background\\-majors or non-majors
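+ As an illustration of the analyses listed above, a minimal sketch of Question 1 (have my course grades declined over the years?), assuming Python with pandas and SciPy; the file name and column names are assumptions made for the example, not a prescribed upload format.
+ {{{
+ import pandas as pd
+ from scipy import stats
+
+ # Hypothetical faculty-uploaded spreadsheet: one row per student,
+ # with the course year and that student's final course grade (0-100).
+ grades = pd.read_csv("course_grades.csv")  # columns: year, grade
+
+ # One-way ANOVA: do mean grades differ among years?
+ groups = [g["grade"].values for _, g in grades.groupby("year")]
+ f_stat, p_anova = stats.f_oneway(*groups)
+
+ # Simple linear trend: is the year-to-year change negative?
+ trend = stats.linregress(grades["year"], grades["grade"])
+
+ print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
+ print(f"Trend: {trend.slope:.2f} points/year, p = {trend.pvalue:.4f}")
+ }}}
+ A significant ANOVA only says that some years differ; the sign of the trend slope is what indicates whether grades are actually declining.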
At line 9 added 7 lines.
+ ----
+ !How will users query?
+ *Assessment item (individual question)
+ *Student responses
+ **On individual items
+ **On a group of items (i.e. an exam or user-defined number of questions)
+ *Concept category
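+ A minimal sketch of one query along the dimensions above (assessment items, student responses, concept categories), assuming Python with sqlite3 and a hypothetical relational schema; the table and column names (assessment_item, concept_tag, response) are illustrative assumptions, not the project's actual schema.
+ {{{
+ import sqlite3
+
+ conn = sqlite3.connect("first_assessment.db")  # hypothetical database file
+
+ # Return every anonymized student response to assessment items tagged
+ # with a given concept category, in a rectangular format (requirement 5).
+ rows = conn.execute(
+     """
+     SELECT i.item_id, i.item_text, r.student_hash, r.score
+     FROM assessment_item AS i
+     JOIN concept_tag     AS t ON t.item_id = i.item_id
+     JOIN response        AS r ON r.item_id = i.item_id
+     WHERE t.category = ?
+     """,
+     ("natural selection",),
+ ).fetchall()
+
+ for item_id, item_text, student_hash, score in rows:
+     print(item_id, student_hash, score)
+ }}}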
Line 12 was replaced by line 77
- Functional requirements (based on use cases) can be found on the [Functional_requirements] page
+ !Functional requirements (based on use cases) can be found on the [Functional_requirements] page
