Use_case

Difference between version 88 and version 87:

Lines 8-9 were replaced by lines 8-9
- |4| What is the impact of multiple attempts at online homework on exam scores in my course?|Correlation between number of tries and exam scores\\Regression/partial regression using GPA or other control variables|Registrar demographic data\\Registrar GPA data\\Faculty uploaded exams, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores\\Student grades on individual items|Log-in as myself, check-off my courses that I am interested in analyzing\\With-in each course, I will mark assessment items of interest (homework questions on topic a, assessment questions on topic a)
- |5| How are my students performing on a particular topic?\\Are they doing better on evolution this year (versus last year)?|Pre-test/post-test analysis\\ANOVA of grades on specific assessment items|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores for each year, Student grades on individual assessment items for each year|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest
+ |4|What is the impact of multiple attempts at online homework on exam scores in my course?|Correlation between number of tries and exam scores\\Regression/partial regression using GPA or other control variables|Registrar demographic data\\Registrar GPA data\\Faculty uploaded exams, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores\\Student grades on individual items|Log-in as myself, check-off my courses that I am interested in analyzing\\With-in each course, I will mark assessment items of interest (homework questions on topic a, assessment questions on topic a)
+ |5|How are my students performing on a particular topic?\\Are they doing better on evolution this year (versus last year)?|Pre-test/post-test analysis\\ANOVA of grades on specific assessment items|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with:\\Student exam scores for each year, Student grades on individual assessment items for each year|Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest
Line 11 was replaced by line 11
- |7| Does a particular teaching innovation (i.e. Avida-ED) impact student learning of a particular topic (i.e. Evolution) in my course?|Correlation between innovation and exam score (or score on particular assessment items on exam) over time|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on keywords or tags\Demographic data\\Rubric for exam or assessment items\\Student response data for exam or assessment items (at T0 and T1)\\Student scores for exam or assessment items (at T0 and T1)
+ |7|Does a particular teaching innovation (i.e. Avida-ED) impact student learning of a particular topic (i.e. Evolution) in my course?|Correlation between innovation and exam score (or score on particular assessment items on exam) over time|Registrar demographic data\\Faculty uploaded assessments and rubric for each year, tagged by topic\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year| Log-in as myself, check-off my courses that I am interested in analyzing\\Within each course, I will mark assessment items of interest to me, based on keywords or tags\\Demographic data\\Rubric for exam or assessment items\\Student response data for exam or assessment items (at T0 and T1)\\Student scores for exam or assessment items (at T0 and T1)
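Use cases 4 and 7 above call for a correlation between attempts (or an innovation) and exam scores, with GPA as a control variable. A minimal sketch of that analysis, using only the standard library and hypothetical toy data (the column values below are invented, not drawn from the FIRST database):

```python
# Use case 4 sketch: correlate number of homework attempts with exam
# scores, then partial out GPA as a control variable.
# All data below is hypothetical, for illustration only.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_correlation(x, y, z):
    """Correlation of x and y with the control variable z partialled out."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Hypothetical per-student records: homework attempts, exam score, GPA.
attempts = [1, 2, 2, 3, 4, 5, 5, 6]
exam     = [62, 70, 68, 75, 80, 84, 82, 90]
gpa      = [2.1, 2.8, 2.5, 3.0, 3.2, 3.6, 3.4, 3.9]

r = pearson(attempts, exam)
r_partial = partial_correlation(attempts, exam, gpa)
```

In practice a full regression package (e.g. R or statsmodels) would replace the hand-rolled partial correlation, but the formula shows the intended control-variable logic.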
Lines 14-18 were replaced by lines 14-18
- |10| For accreditation, I would like to show that our students have learned y.|Comparison of student performance at T0 and T1.|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with topic and Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x
- |11| What levels of understanding are typically targeted by assessments in introductory (or upper) level biology courses at my institution/at institutions similar to mine/at all institutions?|Proportion of lower-level Bloom’s items to higher-level Bloom’s items\\Histogram of data| Faculty uploaded assessments, tagged by Bloom’s data\\Metadata form tagging course level, institution information (size, type)|Search for all assessment items in courses at the 100-level
- |12| What misconceptions do students generally hold about topic d?|Qualitative analysis of student responses and scores to parse out misconceptions| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data| Search for all assessment items on topic d
- |13| What is the impact of course size on the assessments used?|Correlation between course size and Bloom’s taxonomy level| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data\\Metadata form indicating course size|Search for courses of a particular size
- |14| Are low performers in my course also struggling in other courses?|Correlation between course grade or individual assessment grade and university GPA, major GPA| Registrar demographic data\\Registrar GPA data\\Faculty uploaded spreadsheet with student course grades| Log-in as myself, check off my courses that I am interested in analyzing
+ |10|For accreditation, I would like to show that our students have learned y.|Comparison of student performance at T0 and T1.|Registrar demographic data\\Faculty uploaded assessments and rubric, tagged with topic and Bloom’s data\\Faculty uploaded spreadsheet with: Student exam scores for each year; Student grades on individual assessment items for each year|Log-in as myself, check off my courses that I am interested in analyzing\\Select my assessment items from topic x
+ |11|What levels of understanding are typically targeted by assessments in introductory (or upper) level biology courses at my institution/at institutions similar to mine/at all institutions?|Proportion of lower-level Bloom’s items to higher-level Bloom’s items\\Histogram of data| Faculty uploaded assessments, tagged by Bloom’s data\\Metadata form tagging course level, institution information (size, type)|Search for all assessment items in courses at the 100-level
+ |12|What misconceptions do students generally hold about topic d?|Qualitative analysis of student responses and scores to parse out misconceptions| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data| Search for all assessment items on topic d
+ |13|What is the impact of course size on the assessments used?|Correlation between course size and Bloom’s taxonomy level| Registrar demographic data\\Faculty uploaded assessments tagged with topic and Bloom’s data\\Metadata form indicating course size|Search for courses of a particular size
+ |14|Are low performers in my course also struggling in other courses?|Correlation between course grade or individual assessment grade and university GPA, major GPA| Registrar demographic data\\Registrar GPA data\\Faculty uploaded spreadsheet with student course grades| Log-in as myself, check off my courses that I am interested in analyzing
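Use case 11 above asks for the proportion of lower-level to higher-level Bloom's items among tagged assessments. A small sketch of that computation, assuming the standard six-level Bloom's split into lower and higher tiers (the tag values are illustrative, not the database's actual schema):

```python
# Use case 11 sketch: proportion of lower- vs higher-level Bloom's items.
# The lower/higher split and the example tags are illustrative assumptions.
from collections import Counter

LOWER = {"remember", "understand", "apply"}
HIGHER = {"analyze", "evaluate", "create"}

def bloom_proportions(tags):
    """Return (lower_fraction, higher_fraction) for a list of Bloom's tags."""
    counts = Counter(t.lower() for t in tags)
    lower = sum(counts[t] for t in LOWER)
    higher = sum(counts[t] for t in HIGHER)
    total = lower + higher
    return lower / total, higher / total

# Hypothetical items from a 100-level course.
tags = ["remember", "remember", "understand", "apply", "analyze", "create"]
lo, hi = bloom_proportions(tags)  # lo = 4/6, hi = 2/6
```

The resulting fractions feed directly into the histogram the use case describes.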
Line 30 was replaced by lines 30-35
- |2|April 13, 2007\\September 18, 2007|Tagging assessment items: with Blooms data, concept categories, some other taxonomy|yes - will support| | |x| | | |Must
+ |2|April 13, 2007\\September 18, 2007\\June 19, 2007|Tagging assessment items: with Blooms data, concept categories, learning objective, some other taxonomy|yes - will support| | |x| | | |Must
+ |19|July 3, 2007|Assessment items will be tagged according to an ontology|yes|x| |x| |x| |Must
+ |33|November 27, 2007|Database will use the NBII Biocomplexity thesaurus as the Biology ontology.|yes|x| |x| | | |Must
+ |32|November 27, 2007|User will be able to search nbii thesaurus and with a mouse click, add that concept tag to their assessment item(s).|yes?|x| |x| | | |Should
+ |34|November 27, 2007|Ontology for a discipline must not be rigid - additions must be possible.|yes?|x| |x| | |x|Must
+ |52|February 4, 2008\\MU-L|Must be able to use different content metadata by discipline. While we can use the NBII thesaurus for biology, we need to have an API that can be used to define other metadata for other disciplines.|yes - will support| | |x| | |x|Must
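Requirements 32, 34, and 52 above describe a per-discipline vocabulary API: search the NBII Biocomplexity thesaurus, tag an item with one click, allow additions, and let other disciplines plug in their own metadata. A hypothetical sketch of that interface follows; every name here is invented for illustration and the real FIRST API may differ entirely:

```python
# Hypothetical sketch of requirement 52: a pluggable per-discipline
# vocabulary, with search-and-tag (req. 32) and non-rigid additions
# (req. 34). All class and method names are invented for illustration.
class Vocabulary:
    """A discipline's controlled vocabulary of concept tags."""
    def __init__(self, name, terms):
        self.name = name
        self.terms = {t.lower() for t in terms}

    def search(self, query):
        """Substring search, mimicking 'search the thesaurus, click to tag'."""
        q = query.lower()
        return sorted(t for t in self.terms if q in t)

class VocabularyRegistry:
    """Maps each discipline to its vocabulary (biology -> NBII thesaurus)."""
    def __init__(self):
        self.by_discipline = {}

    def register(self, discipline, vocab):
        self.by_discipline[discipline] = vocab

    def tag_item(self, discipline, item_tags, term):
        vocab = self.by_discipline[discipline]
        # The ontology must not be rigid: unknown terms are added, not rejected.
        vocab.terms.add(term.lower())
        item_tags.add(term.lower())

registry = VocabularyRegistry()
registry.register("biology", Vocabulary(
    "NBII Biocomplexity", ["evolution", "natural selection", "ecosystem"]))
tags = set()
registry.tag_item("biology", tags, "evolution")
```

The registry is what makes the scheme discipline-neutral: biology binds to the NBII thesaurus, while another discipline registers its own `Vocabulary` without touching the core code.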
Line 32 was replaced by lines 37-39
- |4|February 7, 2007|This assessment (or assessment item) has multiple rubrics)|yes - will support|x| |x| | | |Must
+ |4|February 4, 2007\\MU-L\\JLM|This assessment (or assessment item) has multiple rubrics)|yes - will support|x| |x| | | |Must
+ |55|February 4, 2008\\MU-L\\JLM|This assessment has been scored in multiple ways (see above).|yes - will support| | |x| | | |Must have
+ |56|February 4, 2008\\MU-L\\JLM|This assessment has multiple and different raters on two (or more) different rubrics|yes - will support| | |x| | | |Must have
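Requirements 4, 55, and 56 above together imply a data shape in which one assessment carries several rubrics, each rubric scored independently by several raters. A minimal sketch of such a shape, with illustrative field names (the real schema is not specified here):

```python
# Sketch of a data shape for requirements 4, 55, 56: one assessment,
# multiple rubrics, multiple raters per rubric. Field names are
# illustrative assumptions, not the FIRST database schema.
from dataclasses import dataclass, field

@dataclass
class Rubric:
    name: str  # e.g. "grading", "misconception coding"
    # rater -> score, so two raters on the same rubric can disagree
    scores_by_rater: dict = field(default_factory=dict)

@dataclass
class Assessment:
    title: str
    rubrics: list = field(default_factory=list)

    def add_score(self, rubric_name, rater, score):
        for r in self.rubrics:
            if r.name == rubric_name:
                r.scores_by_rater[rater] = score
                return
        self.rubrics.append(Rubric(rubric_name, {rater: score}))

a = Assessment("Exam 1, item 3")
a.add_score("grading", "rater-A", 4)
a.add_score("grading", "rater-B", 3)  # second rater, same rubric (req. 56)
a.add_score("misconception coding", "rater-A", "teleology")  # second rubric (req. 4)
```

Keeping scores keyed by rater, rather than storing one number per rubric, is what makes inter-rater comparisons on the same rubric possible later.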
Removed line 34
- |6|June 19, 2007|What is the learning objective associated with this assessment item?|pending|x| |x|x| | |Nice
Line 36 was replaced by lines 42-43
- |8|February 7, 2008|Faculty who upload must agree that any identifying information on student responses has been removed|yes - will support -- how?| | | | | |x|Must
+ |8|February 7, 2008|Faculty who upload data must agree that any identifying information in student responses has been removed|yes - will support -- how?| | | | | |x|Must
+ |18|July 3, 2007\\November 6, 2007|I have additional data/metadata I would like to upload. -or- I made a mistake. Where can I do that?|n/a|x|x|x|x| | |Must
Removed lines 46-47
- |18|July 3, 2007\\November 6, 2007|I have additional data/metadata I would like to upload. -or- I made a mistake. Where can I do that?|n/a|x|x|x|x| | |Must
- |19|July 3, 2007|Assessment items will be tagged according to an ontology, not by categories.|yes|x| |x| |x| |Must
Removed line 56
- |28|November 6, 2007|Demographic data will be stored with individual assessments. Students will not have global identifiers.|no - see question above| | | |x|x|
Removed lines 60-62
- |32|November 27, 2007|User will be able to search nbii thesaurus and with a mouse click, add that concept tag to their assessment item(s).|yes?|x| |x| | | |Should
- |33|November 27, 2007|Database will use the NBII Biocomplexity thesaurus as the Biology ontology.|yes|x| |x| | | |Must
- |34|November 27, 2007|Ontology for a discipline must not be rigid - additions must be possible.|yes?|x| |x| | |x|Must
Removed line 80
- |52|February 4, 2008\\MU-L|Must be able to use different content metadata by discipline. While we can use the NBII thesaurus for biology, we need to have an API that can be used to define other metadata for other disciplines.|yes - will support| | |x| | |x|Must
Removed lines 82-84
- |54|February 4, 2008\\MU-L\\JLM|This assessment has multiple rubrics (i.e., grading rubric, coding rubric for misconceptions, coding rubric for models used, etc).|yes - will support| | |x|x| | |Must have
- |55|February 4, 2008\\MU-L\\JLM|This assessment has been scored in multiple ways (see above).|yes - will support| | |x| | | |Must have
- |56|February 4, 2008\\MU-L\\JLM|This assessment has multiple and different raters on two (or more) different rubrics|yes - will support| | |x| | | |Must have
