
Search Resources

513 Results

Statistics and Visualization for Data Analysis and Inference
Conditional Remix & Share Permitted
CC BY-NC-SA

A whirlwind tour of the statistics used in behavioral science research, covering topics including data visualization, building your own null-hypothesis distribution through permutation, useful parametric distributions, the generalized linear model, and model-based analyses more generally. Familiarity with MATLAB®, Octave, or R will be useful; prior experience with statistics will be helpful but is not essential. This course is intended to be a ground-up sketch of a coherent alternative perspective to the "null-hypothesis significance testing" method for behavioral research (but don't worry if you don't know what this means).
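
The description mentions building a null-hypothesis distribution through permutation; as a rough, generic illustration of that idea (not taken from the course materials, using made-up data), a minimal Python sketch might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reaction-time data for two conditions (made-up numbers).
group_a = np.array([1.2, 0.9, 1.4, 1.1, 1.3, 1.0])
group_b = np.array([1.6, 1.5, 1.2, 1.7, 1.4, 1.8])

observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])

# Build the null distribution by shuffling condition labels many times.
n_perm = 10_000
null = np.empty(n_perm)
for i in range(n_perm):
    shuffled = rng.permutation(pooled)
    null[i] = shuffled[len(group_a):].mean() - shuffled[:len(group_a)].mean()

# Two-sided p-value: how often is a shuffled difference at least as extreme?
p_value = np.mean(np.abs(null) >= abs(observed))
print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")
```

The same few steps translate directly to MATLAB, Octave, or R, the environments the course description suggests.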

Subject:
Applied Science
Computer Science
Engineering
Mathematics
Statistics and Probability
Material Type:
Full Course
Provider:
MIT
Provider Set:
MIT OpenCourseWare
Author:
Frank, Mike
Vul, Ed
Date Added:
01/01/2009
Stebbins
Unrestricted Use
CC BY

Stebbins is a game about evolution. Students collect data as predators "eating" colored circles on a colored background, being careful to avoid the poisonous ones. Data analysis reveals how the population changes color over time, and can be used to illuminate a common misconception that individuals change in response to predation. Stebbins is modeled on a non-digital game-like simulation of natural selection created by evolutionary biologist G. Ledyard Stebbins.

Subject:
Life Science
Material Type:
Activity/Lab
Provider:
Concord Consortium
Provider Set:
Concord Consortium
Author:
Concord Consortium
Date Added:
05/14/2021
Stebbins
Unrestricted Use
CC BY

Stebbins is a game about evolution. Students collect data as predators “eating” colored circles on a colored background, being careful to avoid the poisonous ones. Data analysis reveals how the population changes color over time, and can be used to illuminate a common misconception that individuals change in response to predation. Stebbins is modeled on a non-digital game-like simulation of natural selection created by evolutionary biologist G. Ledyard Stebbins.

Subject:
Biology
Life Science
Mathematics
Measurement and Data
Statistics and Probability
Material Type:
Activity/Lab
Simulation
Author:
Concord Consortium
Date Added:
08/20/2020
Strategic Marketing Measurement
Conditional Remix & Share Permitted
CC BY-NC-SA

Marketing research may be divided into methods that emphasize understanding "the customer" and methods that emphasize understanding "the market." This course (15.822) deals with the market. The companion course (15.821) deals with the customer.
The course will teach you how to write, conduct and analyze a marketing research survey. The emphasis will be on discovering market structure and segmentation, but you can pursue other project applications.
A major objective of the course is to give you some "hands-on" exposure to analysis techniques that are widely used in consulting and marketing research (factor analysis, perceptual mapping, conjoint analysis, and cluster analysis). These techniques used to be considered advanced but now involve just a few keystrokes in most statistical software packages; a minimal sketch of one of them appears below.
The course assumes familiarity with basic probability, statistics, and multiple linear regression.
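
To make the "few keystrokes" point concrete, here is a generic cluster-analysis sketch in Python with scikit-learn and fabricated survey data; it is not drawn from the course, which uses standard statistical packages rather than code.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Made-up survey data: 60 respondents rating 4 product attributes on a 1-7 scale,
# drawn from two hypothetical segments with different preference profiles.
X = np.vstack([
    rng.normal([6, 2, 5, 3], 0.7, size=(30, 4)),
    rng.normal([2, 6, 3, 5], 0.7, size=(30, 4)),
]).clip(1, 7)

# Two-segment solution; in practice you would compare several values of k.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("segment sizes:", np.bincount(km.labels_))
print("segment centers (mean attribute ratings):")
print(np.round(km.cluster_centers_, 2))
```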

Subject:
Business and Communication
Marketing
Material Type:
Full Course
Provider:
MIT
Provider Set:
MIT OpenCourseWare
Author:
Prelec, Drazen
Date Added:
09/01/2002
Student Precision Using Glass Pipettes
Conditional Remix & Share Permitted
CC BY-NC-SA

In this lab activity, students gather data using glass pipettes to explore the concepts of precision and data analysis.

Subject:
Chemistry
Physical Science
Material Type:
Activity/Lab
Assessment
Lesson Plan
Provider:
Science Education Resource Center (SERC) at Carleton College
Provider Set:
Pedagogy in Action
Author:
Sylvia Hoffstrom
Date Added:
12/09/2011
Studies in Western Music History: Quantitative and Computational Approaches to Music History
Conditional Remix & Share Permitted
CC BY-NC-SA

The disciplines of music history and music theory have been slow to embrace the digital revolutions that have transformed other fields' text-based scholarship (history and literature in particular). Computational musicology opens the door to the possibility of understanding, even if only at a broad level, the trends and norms of behavior of large repertories of music. This class presents the major approaches, results, and challenges of computational musicology through readings in the field, familiarity with datasets, and hands-on workshops and assignments on data analysis and "corpus" (i.e., repertory) studies. Class sessions alternate between discussion/lecture and labs on digital tools for studying music. A background in music theory and/or history is required, and experience in computer programming will be extremely helpful. Coursework culminates in an independent research project in quantitative or computational musicology that will be presented to the class as a whole.

Subject:
Arts and Humanities
Performing Arts
Material Type:
Full Course
Provider:
MIT
Provider Set:
MIT OpenCourseWare
Author:
Cuthbert, Michael
Date Added:
02/01/2012
Syllabus: Data Analytics
Conditional Remix & Share Permitted
CC BY-NC-SA

Syllabus for the course "CSCI 381/780 - Data Analytics" delivered at Queens College in Spring 2019 by Kumar Ramansenthil as part of the Tech-in-Residence Corps program.

Subject:
Applied Science
Computer Science
Material Type:
Syllabus
Date Added:
02/15/2019
Syllabus: Probability and Statistics for Computer Science
Conditional Remix & Share Permitted
CC BY-NC-SA

Syllabus for the course "CSC 21700 - Probability and Statistics for Computer Science" delivered at the City College of New York in Spring 2019 by Evan Agovino as part of the Tech-in-Residence Corps program.

Subject:
Applied Science
Computer Science
Mathematics
Statistics and Probability
Material Type:
Syllabus
Date Added:
02/15/2019
Taphonomy Experiment
Conditional Remix & Share Permitted
CC BY-NC-SA

The taphonomy project is a semester-long experiment that the students design and run themselves, using the decomposition studies area run by the Criminal Justice department on campus. Following a discussion of taphonomic processes during the first week, the students come up with original questions to test. Working in pairs, they design the experiment, including methods, materials, sampling interval, and taphonomic evaluation. The students set up the experiment in week 3 and monitor it over the course of the semester. They are required to keep an experimental journal and to upload data from it to a wiki page, and they periodically evaluate and comment on other students' projects. The professor also periodically evaluates the groups' progress through the wiki. The end result is a 20-minute presentation, in the style of an oral paper at a conference, given the week before finals. This project develops the students' skills in experimental design, data analysis, and written, oral, and visual communication.


Subject:
Biology
Life Science
Material Type:
Activity/Lab
Homework/Assignment
Provider:
Science Education Resource Center (SERC) at Carleton College
Provider Set:
Teach the Earth
Author:
Karen Koy
Date Added:
08/20/2019
Teaching Data Analysis in the Social Sciences: A case study with article level metrics
Conditional Remix & Share Permitted
CC BY-NC-SA

This case study is taken from the open book Open Data as Open Educational Resources: Case Studies of Emerging Practice.

Course description:

Metrics and measurement are important strategic tools for understanding the world around us. To take advantage of the possibilities they offer, however, one needs the ability to gather, work with, and analyse datasets, both big and small. This is why metrics and measurement feature in the seminar course Technology and Evolving Forms of Publishing, and why data analysis was a project option for the Technology Project course in Simon Fraser University’s Master of Publishing Program.

The assignment:

"Data Analysis with Google Refine and APIs": Pick a dataset and an API of your choice (Twitter, VPL, Biblioshare, CrossRef, etc.) and combine them using Google Refine. Clean and manipulate your data for analysis. The complexity/messiness of your data will be taken into account.
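
The assignment is built around Google Refine, but the same fetch-combine-clean workflow can be sketched in a few lines of Python against the CrossRef API (the DOIs below are placeholders; substitute identifiers from your own dataset):

```python
import requests

# Placeholder DOIs standing in for a dataset of articles to enrich.
dois = ["10.1371/journal.pone.0115069", "10.1038/nature12373"]

rows = []
for doi in dois:
    # CrossRef's public works endpoint returns JSON metadata for a DOI.
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    msg = resp.json()["message"]
    rows.append({
        "doi": doi,
        "title": (msg.get("title") or ["(untitled)"])[0].strip(),
        "citations": msg.get("is-referenced-by-count", 0),
    })

# A simple cleaning step: drop rows without a usable title, then sort by citations.
rows = [r for r in rows if r["title"] != "(untitled)"]
for r in sorted(rows, key=lambda r: r["citations"], reverse=True):
    print(r["doi"], "|", r["title"], "|", r["citations"])
```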

Subject:
Applied Science
Information Science
Social Science
Sociology
Material Type:
Case Study
Author:
Alessandra Bordini
Juan Pablo Alperin
Katie Shamash
Date Added:
03/27/2019
Teaching Infographics as Multiliteracy Arguments
Conditional Remix & Share Permitted
CC BY-SA

From "The Spectrum of Apple Flavors" to "We are all Zebras: How Rare Disease is Shaping the Future of Healthcare," we find colorful visual displays of information and data used to persuade, inform and delight their audience-readers. Most infographic assignments result in loose collections of related facts and numbers, essentially a collage or poster. Student create displays of unrelated factoids and spurious data correlations and they "ooh" and "ahhh" at beautiful nothings. However, the visual and textual elements of an infographic can culminate in a coherent multimodal argument which prompts inquiry in the creator and the audience.  In order to teach infographics as a claim expressed through visual metaphor, supported by reasoning with evidence in multiple modes, instructors employ a sequence of interventions to invoke the relevant skills and strategies at appropriate moments.  Composing and critiquing infographics can enhance understanding of both the content and rhetoric, since people analyze, elaborate and critique information more deeply when visual and textal modes are combined (Lazard and Atkinson 2014).This pedagogy of reading and writing multiple literacies can be adapted to other multimodal products. For an overview, refer to "Recipe for an Infographic" (Abilock and Williams 2014) which is also listed in the references for this module. We recommend that you experience this process yourself as you teach it to students.   

Subject:
Information Science
Material Type:
Module
Author:
Debbie Abilock
Date Added:
08/25/2017
Teaching Infographics as Multiliteracy Arguments
Conditional Remix & Share Permitted
CC BY-NC-SA

From "The Spectrum of Apple Flavors" to "We are all Zebras: How Rare Disease is Shaping the Future of Healthcare," we find colorful visual displays of infGrotewold, K. (2020, August). Framework for analysis of visual information. In Assessing Visual Materials for Diversity & Inclusivity. https://www.oercommons.org/courseware/lesson/69336/. Licensed as CC BY-NC-SA   

Subject:
Information Science
Material Type:
Activity/Lab
Author:
Aubree Evans
Date Added:
12/09/2020
Teaching Principles Students How to Assess the State of the Economy
Conditional Remix & Share Permitted
CC BY-NC-SA

Principles of economics students are asked to collect and analyze data on a few macroeconomic aggregates to give them a first taste of empirical work.

Subject:
Business and Communication
Economics
Mathematics
Social Science
Material Type:
Activity/Lab
Provider:
Science Education Resource Center (SERC) at Carleton College
Provider Set:
Quantitative Writing (SERC)
Author:
Steven Greenlaw
Date Added:
08/28/2012
Technology Design: The Movement of Means
Conditional Remix & Share Permitted
CC BY-NC-SA

To promote students' conceptual understanding and learning experience in introductory statistics, a technology task focused on the distribution of sample means was created using TinkerPlots, an exploratory data analysis and modeling software package. The targeted audience ranges from senior high school students to college freshmen starting their introductory course in statistics. Students are guided to explore and discover how the means of sets of numbers behave when the numbers are randomly generated from a fixed range of values characterized by a predetermined probability distribution. The cognitive, mathematical, technological, and pedagogical nature of the task, as well as its relationship to the statistics education framework based on the Guidelines for Assessment and Instruction in Statistics Education (GAISE) from the American Statistical Association, is elaborated. A brief discussion of which cognitive design principles the task satisfies is provided at the end.
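
The task itself lives in TinkerPlots, but the core idea, watching how sample means behave when values are drawn repeatedly from a fixed distribution, can be sketched in a few lines of Python (the range, sample size, and number of samples below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

# Draw many samples from a fixed uniform range and record each sample's mean.
n_samples, sample_size = 1000, 25
samples = rng.uniform(low=0, high=100, size=(n_samples, sample_size))
means = samples.mean(axis=1)

# The sample means cluster around the population mean (50),
# with a spread close to the theoretical sigma / sqrt(n).
print(f"mean of the sample means = {means.mean():.2f}")
print(f"SD of the sample means   = {means.std(ddof=1):.2f}")
print(f"theoretical SD of a mean = {100 / np.sqrt(12) / np.sqrt(sample_size):.2f}")
```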

Subject:
Mathematics
Statistics and Probability
Material Type:
Simulation
Provider:
CUNY Academic Works
Provider Set:
Borough of Manhattan Community College
Author:
Yu Gu
Date Added:
01/01/2017
"To Kill a Mockingbird": An Introduction to 1930s America
Unrestricted Use
Public Domain

This activity teaches students about the setting of Harper Lee’s famous novel “To Kill a Mockingbird,” which takes place during 3 years (1933–1935) of the Great Depression. Part 1 of this activity can be used before students start reading the novel to help them understand what life was like in the 1930s. In this part, students will examine and answer questions about census documents that feature unemployment numbers and related information. Part 2 can be completed after students have read the first few chapters of the novel. In this part, students will write a piece using the RAFT technique (role, audience, format, topic) to show what they learned about the 1930s and what they have read so far.

Subject:
Mathematics
Statistics and Probability
Material Type:
Activity/Lab
Provider:
U.S. Census Bureau
Provider Set:
Statistics in Schools
Date Added:
10/18/2019
Tools for Reproducible Research
Read the Fine Print

Course summary
A minimal standard for data analysis and other scientific computations is that they be reproducible: the code and data are assembled in such a way that another group can re-create all of the results (e.g., the figures in a paper). The importance of such reproducibility is now widely recognized, but it is still not as widely practiced as it should be, in large part because many computational scientists (and statisticians in particular) have not fully adopted the required tools for reproducible research.

In this course, we will discuss general principles for reproducible research but will focus primarily on the use of relevant tools (particularly make, git, and knitr), with the goal that the students leave the course ready and willing to ensure that all aspects of their computational research (software, data analyses, papers, presentations, posters) are reproducible.
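
The course itself centers on make, git, and knitr; purely as a toy illustration of the underlying standard, that someone else can re-create every result exactly, here is a small, fully seeded Python script whose output can be checked byte for byte (the data are fabricated):

```python
import hashlib
import numpy as np

def run_analysis(seed: int = 2024) -> bytes:
    # Everything downstream of the fixed seed is deterministic, so anyone
    # re-running this script should get byte-identical output.
    rng = np.random.default_rng(seed)
    data = rng.normal(loc=10.0, scale=2.0, size=1000)  # stand-in for real data
    summary = f"mean={data.mean():.6f}, sd={data.std(ddof=1):.6f}\n"
    return summary.encode("utf-8")

result = run_analysis()
print(result.decode(), end="")
print("checksum:", hashlib.sha256(result).hexdigest()[:16])
```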

Subject:
Applied Science
Information Science
Material Type:
Full Course
Author:
Karl Broman
Date Added:
08/07/2020
Toward Understanding the Role of Web 2.0 Technology in Self-Directed Learning and Job Performance in a Single Organizational Setting: A Qualitative Case Study, Online Submission, 2016-May
Only Sharing Permitted
CC BY-ND

This single instrumental qualitative case study explores and thickly describes job performance outcomes based upon the manner in which the self-directed learning activities of a purposefully selected sample of 3 construction managers are conducted, mediated by the use of Web 2.0 technology. The data collected revealed that construction managers are concerned with the performance expected of them, in addition to how well they perform their work-related activities (orientation to learning), indicating that organizations should provide guidelines on the use and expected outcomes of self-directed learning in addition to providing the tools, resources, and time (environmental factors) to match performance needs. Construction managers feel that the work-related activities expected of them, how well those activities are performed, and the consequences for poor performance at work are determining factors in selecting Web 2.0 technologies. While construction managers understand the need for rules restricting the use of Web 2.0 technologies in performing their jobs, they feel these rules do hinder their performance, because access to specific information they need to answer a question, solve a problem, or learn something new is sometimes restricted. Successful performance outcomes are determined by compliance with the expected performance behaviors of others, such as answering a question or solving a problem an architect or superintendent has presented, as well as by expectations construction managers have set for themselves. The following are appended: (1) Call for Participation--Web 2.0 Technology Project; (2) Informed Consent Letter and Form/Template; (3) Semistructured Interview Guide; and (4) Permission to Conduct Research Study.

Subject:
Business and Communication
Career and Technical Education
Education
Electronic Technology
Management
Material Type:
Case Study
Author:
Caruso Shirley J
Date Added:
02/22/2022
Transitioning from Excel to MATLAB Diffusion Models
Conditional Remix & Share Permitted
CC BY-NC-SA

This activity is part of a larger module that introduces students to two different ways to model chemical diffusion in minerals: 1) 1D diffusion in Excel using finite differences and 2) 1D diffusion in MATLAB using the same equations. It is designed to help students apply diffusion equations derived previously in class to understand natural zonation of elements in minerals. The students build the model first in Excel, and then in MATLAB to obtain the timescales of diffusive re-equilibration related to magma storage and transport at Kīlauea Volcano, Hawai'i. The major goals are to help students transition from visual platforms (e.g., Excel) to writing computer code (e.g., in MATLAB), implementing for loops for iterative calculations, and thinking about how the geologic parameters (temperature, pressure, fO2) affect the model results.
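
The module's own models are built in Excel and MATLAB; purely as a generic sketch of the explicit finite-difference approach it describes (the diffusivity, grid, and initial profile below are arbitrary, not the module's values), the same loop looks like this in Python:

```python
import numpy as np

# 1D diffusion by explicit finite differences: dC/dt = D * d2C/dx2.
# All numbers below are illustrative only.
D = 1e-17              # diffusivity, m^2/s
dx = 1e-6              # grid spacing, m
dt = 0.4 * dx**2 / D   # time step kept below the stability limit dx^2 / (2*D)
n_steps = 500

# Initial profile: a concentration step across the middle of a 1D grid.
c = np.ones(101)
c[50:] = 0.0

for _ in range(n_steps):
    c_new = c.copy()
    # Standard three-point stencil for the interior points.
    c_new[1:-1] = c[1:-1] + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c = c_new  # end points act as fixed-concentration boundaries

print(f"modelled time: {n_steps * dt:.3g} s")
print("profile near the step:", np.round(c[45:56], 3))
```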

Subject:
Applied Science
Computer Science
Geology
Mathematics
Measurement and Data
Physical Science
Material Type:
Activity/Lab
Homework/Assignment
Provider:
Science Education Resource Center (SERC) at Carleton College
Provider Set:
Teach the Earth
Author:
Kendra Lynn
Date Added:
01/20/2023
Transparency and Open Science Symposium GSA 2019
Unrestricted Use
Public Domain

The past decade has seen rapid growth in conversations around and progress towards fostering a more transparent, open, and cumulative science. Best practices are being codified and established across fields relevant to gerontology from cancer science to psychological science. Many of the areas currently under development are of particular relevance to gerontologists such as best practices in balancing open science with participant confidentiality or best practices for preregistering archival, longitudinal data analysis. The present panel showcases one of the particular strengths of the open science movement - the contribution that early career researchers are making to these ongoing conversations on best practices. Early career researchers have the opportunity to blend their expertise with technology, their knowledge of their disciplines, and their vision for the future in shaping these conversations. In this panel, three early career researchers share their insights. Pfund presents an introduction to preregistration and the value of preregistration from the perspective of “growing up” within the open science movement. Seaman discusses efforts in and tools for transparency and reproducibility in neuroimaging of aging research. Ludwig introduces the idea of registered reports as a particularly useful form of publication for researchers who use longitudinal methods and/or those who work with hard-to-access samples. The symposium will include time for the audience to engage the panel in questions and discussion about current efforts in and future directions for transparent, open, and cumulative science efforts in gerontology.

Subject:
Life Science
Social Science
Material Type:
Reading
Author:
Eileen K Graham
Gabrielle N
Jennifer Lodi-smith
Kendra Leigh Seaman
Rita M
Date Added:
08/03/2021
Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis
Unrestricted Use
CC BY

Background: Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature. Objective: This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating for the presence of 8 indicators of reproducible and transparent research practices. Methods: By implementing a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018. After generating a list of eligible dermatology publications, we then searched for full-text PDF versions by using Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency—availability of materials, data, analysis scripts, protocol, preregistration, conflict of interest statement, funding statement, and open access—using a pilot-tested Google Form. Results: After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were more poorly reported than others. We found that most publications (113, 88.9%) did not provide unmodified, raw data used to make computations, 124 (97.6%) failed to make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts. Conclusions: Our sample of studies published in dermatology journals does not appear to include sufficient detail to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted. More robust reporting of key methodological details, open data sharing, and stricter standards journals impose on authors regarding disclosure of study materials might help to better the climate of reproducible research in dermatology. [JMIR Dermatol 2019;2(1):e16078]

Subject:
Applied Science
Biology
Genetics
Health, Medicine and Nursing
Life Science
Material Type:
Reading
Provider:
JMIR Dermatology
Author:
Andrew Niemann
Austin L. Johnson
Courtney Cook
Daniel Tritz
J. Michael Anderson
Matt Vassar
Date Added:
08/07/2020