diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..5b6a065
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,4 @@
+.Rproj.user
+.Rhistory
+.RData
+.Ruserdata
diff --git a/Code of Conduct.pdf b/Code of Conduct.pdf
new file mode 100644
index 0000000..b987daf
Binary files /dev/null and b/Code of Conduct.pdf differ
diff --git a/HUDK2017.rdf b/HUDK2017.rdf
new file mode 100644
index 0000000..15c0041
--- /dev/null
+++ b/HUDK2017.rdf
@@ -0,0 +1,1368 @@
+
+
+ journalArticle
+
+
+
+
+
+ Bowers
+ Alex J.
+
+
+
+
+
+
+
+ data
+
+
+ data analysis
+
+
+ Decision Making
+
+
+ Dropouts
+
+
+
+ Elementary School Students
+
+
+
+
+ Grades (Scholastic)
+
+
+
+ Identification
+
+
+
+ MULTIVARIATE analysis
+
+
+
+
+ School Districts
+
+
+
+
+ Secondary School Students
+
+
+ 2010/05/00
+ 2014-09-24 19:31:29
+ ERIC
+ en
+ School personnel currently lack an effective method to pattern and visually interpret disaggregated achievement data collected on students as a means to help inform decision making. This study, through the examination of longitudinal K-12 teacher assigned grading histories for entire cohorts of students from a school district (n=188), demonstrates a novel application of hierarchical cluster analysis and pattern visualization in which all data points collected on every student in a cohort can be patterned, visualized and interpreted to aid in data driven decision making by teachers and administrators. Additionally, as a proof-of-concept study, overall schooling outcomes, such as student dropout or taking a college entrance exam, are identified from the data patterns and compared to past methods of dropout identification as one example of the usefulness of the method. Hierarchical cluster analysis correctly identified over 80% of the students who dropped out using the entire student grade history patterns from either K-12 or K-8. (Contains 5 figures.)
+ Analyzing the Longitudinal K-12 Grading Histories of Entire Cohorts of Students: Grades, Data Driven Decision Making, Dropping out and Hierarchical Cluster Analysis
+ Analyzing the Longitudinal K-12 Grading Histories of Entire Cohorts of Students
+
+
+ 15
+ 7
+ Practical Assessment, Research & Evaluation
+ ISSN 1531-7714
+
+
+ attachment
+
+ Bowers_2010_Analyzing the Longitudinal K-12 Grading Histories of Entire Cohorts of Students.pdf
+ 2
+ application/pdf
+
+
+ attachment
+
+
+
+ http://eric.ed.gov/?id=EJ933686
+
+
+ 2014-09-24 19:31:29
+ Snapshot
+ 1
+ text/html
+
+
+ journalArticle
+
+
+
+
+
+ Grunspan
+ Daniel Z.
+
+
+
+
+ Wiggins
+ Benjamin L.
+
+
+
+
+ Goodreau
+ Steven M.
+
+
+
+
+
+
+ Week 2
+
+
+ http://www.lifescied.org/content/13/2/167
+
+
+ 167-178
+ 06/20/2014
+ 2014-08-20 20:21:46
+ www.lifescied.org
+ en
+ Social interactions between students are a major and underexplored part of undergraduate education. Understanding how learning relationships form in undergraduate classrooms, as well as the impacts these relationships have on learning outcomes, can inform educators in unique ways and improve educational reform. Social network analysis (SNA) provides the necessary tool kit for investigating questions involving relational data. We introduce basic concepts in SNA, along with methods for data collection, data processing, and data analysis, using a previously collected example study on an undergraduate biology classroom as a tutorial. We conduct descriptive analyses of the structure of the network of costudying relationships. We explore generative processes that create observed study networks between students and also test for an association between network position and success on exams. We also cover practical issues, such as the unique aspects of human subjects review for network studies. Our aims are to convince readers that using SNA in classroom environments allows rich and informative analyses to take place and to provide some initial tools for doing so, in the process inspiring future educational studies incorporating relational data.
+ Understanding Classrooms through Social Network Analysis: A Primer for Social Network Analysis in Education Research
+ Understanding Classrooms through Social Network Analysis
+
+
+ 13
+ 2
+ CBE-Life Sciences Education
 + ISSN 1931-7913
+ CBE Life Sci Educ
+ DOI 10.1187/cbe.13-08-0162
+
+
+ attachment
+
+ Grunspan et al_2014_Understanding Classrooms through Social Network Analysis.pdf
+ 2
+ application/pdf
+
+
+ attachment
+
+
+
+ http://www.lifescied.org/content/13/2/167
+
+
+ 2014-08-20 20:21:46
+ Snapshot
+ 1
+ text/html
+
+
+ blogPost
+
+
+ The Chronicle of Higher Education Blogs: Wired Campus
+
+
+
+
+
+
+ Young
+ Jeffrey R.
+
+
+
+
+
+
+
+ http://chronicle.com/blogs/wiredcampus/why-students-should-own-their-educational-data/54329
+
+
+ August 21, 2014
+ 2014-08-23 21:32:22
+ Why Students Should Own Their Educational Data
+
+
+ attachment
+
+
+
+ http://chronicle.com/blogs/wiredcampus/why-students-should-own-their-educational-data/54329
+
+
+ 2014-08-23 21:32:24
+ Chronicle of Higher Education Snapshot
+ 1
+ text/html
+
+
+ journalArticle
+
+
+
+
+
+ Corbett
+ Albert T.
+
+
+
+
+ Anderson
+ John R.
+
+
+
+
+
+
+
+ Education (general)
+
+
+
+
+ empirical validity
+
+
+
+
+ individual differences
+
+
+
+
+ intelligent tutoring systems
+
+
+
+ Learning
+
+
+
+ Management of Computing and Information Systems
+
+
+
+
+ mastery learning
+
+
+
+
+ Multimedia Information Systems
+
+
+
+
+ procedural knowledge
+
+
+
+
+ Psychology, general
+
+
+
+
+ student modeling
+
+
+
+
+ User Interfaces and Human Computer Interaction
+
+
+
+
+ http://link.springer.com.ezp-prod1.hul.harvard.edu/article/10.1007/BF01099821
+
+
+ 253-278
+ 1994/12/01
+ 2013-04-21 21:21:19
+ link.springer.com.ezp-prod1.hul.harvard.edu
+ en
 + This paper describes an effort to model students' changing knowledge state during skill acquisition. Students in this research are learning to write short programs with the ACT Programming Tutor (APT). APT is constructed around a production rule cognitive model of programming knowledge, called the ideal student model. This model allows the tutor to solve exercises along with the student and provide assistance as necessary. As the student works, the tutor also maintains an estimate of the probability that the student has learned each of the rules in the ideal model, in a process called knowledge tracing. The tutor presents an individualized sequence of exercises to the student based on these probability estimates until the student has ‘mastered’ each rule. The programming tutor, cognitive model and learning and performance assumptions are described. A series of studies is reviewed that examine the empirical validity of knowledge tracing and has led to modifications in the process. Currently the model is quite successful in predicting test performance. Further modifications in the modeling process are discussed that may improve performance levels.
+ Knowledge tracing: Modeling the acquisition of procedural knowledge
+ Knowledge tracing
+
+
+ 4
+ 4
+ User Modeling and User-Adapted Interaction
+ ISSN 0924-1868, 1573-1391
+ User Model User-Adap Inter
+ DOI 10.1007/BF01099821
+
+
+ attachment
+
+ Corbett_Anderson_1994_Knowledge tracing.pdf
+ 2
+ application/pdf
+
+
+ conferencePaper
+
+
+
+ LAK '12
+
+ ISBN 978-1-4503-1111-3
+ DOI 10.1145/2330601.2330661
 + Proceedings of the 2nd International Conference on Learning Analytics and Knowledge
+
+
+
+
+
+
+ New York, NY, USA
+
+
+ ACM
+
+
+
+
+
+
+ Siemens
+ George
+
+
+
+
+ Baker
+ Ryan S. J. d.
+
+
+
+
+
+
+ Collaboration
+
+
+
+ educational data mining
+
+
+
+
+ learning analytics and knowledge
+
+
+
+
+ http://doi.acm.org/10.1145/2330601.2330661
+
+
+ 252–254
+ 2012
+ 2015-01-16 03:15:55
+ ACM Digital Library
+ Growing interest in data and analytics in education, teaching, and learning raises the priority for increased, high-quality research into the models, methods, technologies, and impact of analytics. Two research communities -- Educational Data Mining (EDM) and Learning Analytics and Knowledge (LAK) have developed separately to address this need. This paper argues for increased and formal communication and collaboration between these communities in order to share research, methods, and tools for data mining and analysis in the service of developing both LAK and EDM fields.
+ Learning Analytics and Educational Data Mining: Towards Communication and Collaboration
+ Learning Analytics and Educational Data Mining
+
+
+ attachment
+
+ Siemens_Baker_2012_Learning Analytics and Educational Data Mining.pdf
+ 2
+ application/pdf
+
+
+ book
+
+
+
+
+ Sebastopol, CA
+
+
 + O'Reilly Media
+
+
+
+
+
+
+ Zheng
+ Alice
+
+
+
+
+
+
+
+ http://www.oreilly.com/data/free/evaluating-machine-learning-models.csp?intcmp=il-data-free-lp-lgen_free_reports_page
+
+
+ September 2015
+ 2015-12-15 18:26:39
+ Data science today is a lot like the Wild West: there’s endless opportunity and excitement, but also a lot of chaos and confusion. If you’re new to data science and applied machine learning, evaluating a machine-learning model can seem pretty overwhelming...
+ Evaluating Machine Learning Models
+
+
+ attachment
+
+
+
+ http://www.oreilly.com/data/free/evaluating-machine-learning-models.csp?intcmp=il-data-free-lp-lgen_free_reports_page
+
+
+ 2015-12-15 18:26:39
+ Snapshot
+ 1
+ text/html
+
+
+ videoRecording
+
+
+
+
+ Educause
+
+
+
+
+
+
+
+
+ Collier
+ Amy
+
+
+
+
+ Hickey
+ Daniel
+
+
+
+
+ Reich
+ Justin
+
+
+
+
+ Wagner
+ Ellen
+
+
+
+
+ Campbell
+ Gardner
+
+
+
+
+
+ Assessment
+
+
+ Education
+
+
+
+ educational assessment
+
+
+
+ EDUCAUSE
+
+
+
+ Higher Education
+
+
+
+ learners
+
+
+ Learning
+
+
+
+ Teaching and learning
+
+
+
+
+ https://www.youtube.com/watch?v=_iv8A1pHNYA
+
+
+ 2015-08-17
+ 2016-01-17 18:50:57
+ YouTube
+ 470 seconds
+ Several higher education learning and assessment professionals discuss the difficulties of measuring learning.
+ Why Is Measuring Learning So Difficult?
+
+
+ webpage
+
+
+
+
+
+
+
+ Weinersmith
+ Zach
+
+
+
+
+
+
+
+ http://www.smbc-comics.com/index.php?id=3978
+
+
+ January 5 2016
+ 2016-01-18 18:17:09
+ Saturday Morning Breakfast Cereal
+
+
+ attachment
+
+
+
+ http://www.smbc-comics.com/index.php?id=3978
+
+
+ 2016-01-18 18:17:10
+ Saturday Morning Breakfast Cereal
+ 1
+ text/html
+
+
+ webpage
+
+
+
+
+
+
+ RStudio
+
+
+
+
+
+
+ http://www.rstudio.com/wp-content/uploads/2015/02/data-wrangling-cheatsheet.pdf
+
+
+ January 2015
+ 2016-01-18 18:42:27
+ The Data Wrangling Cheatsheet
+
+
+ attachment
+
+
+
+ http://www.rstudio.com/wp-content/uploads/2015/02/data-wrangling-cheatsheet.pdf
+
+
+ 2016-01-18 18:42:27
+ data-wrangling-cheatsheet - data-wrangling-cheatsheet.pdf
+ 1
+ application/pdf
+
+
+ conferencePaper
+
+
+ Proceedings of the Fourth International Conference on Learning Analytics And Knowledge
+
+
+
+ ACM
+
+
+
+
+
+ Clow
+ Doug
+
+
+
+
+ 49–53
+ 2014
+ Data wranglers: human interpreters to help close the feedback loop
+
+
+ magazineArticle
+
+ The Conversation
+
+
+
+
+
+ Kucirkova
+ Natalia
+
+
+
+
+ FitzGerald
+ Elizabeth
+
+
+
+
+
+
+
+ http://theconversation.com/zuckerberg-is-ploughing-billions-into-personalised-learning-why-51940
+
+
+ December 9 2015
+ 2016-01-18 19:14:05
 + Zuckerberg wants to plough billions into personalised learning, but his way may not be the right way.
+ Zuckerberg is ploughing billions into 'personalised learning' – why?
+
+
+ attachment
+
+
+
+ https://theconversation.com/zuckerberg-is-ploughing-billions-into-personalised-learning-why-51940
+
+
+ 2016-01-18 19:14:05
+ Snapshot
+ 1
+ text/html
+
+
+ videoRecording
+
 + YouTube
+
+
+
+
+ Udacity
+
+
+
+
+
+
+
+ Georgia Tech
+
+
+
+
+
+
+
+ https://www.youtube.com/watch?v=8CpRLplmdqE
+
+
+ 23 February 2015
+ 2016-01-18 19:18:06
+ 3:13
+ Feature Selection
+
+
+ attachment
+
+
+
+ https://www.youtube.com/watch?v=8CpRLplmdqE
+
+
+ 2016-01-18 19:18:06
+ Snapshot
+ 1
+ text/html
+
+
+ webpage
+
+ RStudio
+
+
+
+
+
 + Grolemund
+ Garrett
+
+
+
+
+
+
+
+ https://www.rstudio.com/resources/cheatsheets/
+
+
+ 01 August 2014
+ 2016-01-19 21:17:28
+ RStudio Cheat Sheets
+
+
+ attachment
+
+
+
+ http://shiny.rstudio.com/articles/rm-cheatsheet.html
+
+
+ 2016-01-19 21:17:29
+ Shiny - The R Markdown Cheat sheet
+ 1
+ text/html
+
+
+ journalArticle
+
+
+
+
+
+ Greller
+ Wolfgang
+
+
+
+
+ Drachsler
+ Hendrik
+
+
+
+
+
+
+ http://www.jstor.org/stable/jeductechsoci.15.3.42
+
+
+ 42-57
+ 2012
+ 2016-09-03 18:55:41
+ JSTOR
 + With the increase in available educational data, it is expected that Learning Analytics will become a powerful means to inform and support learners, teachers and their institutions in better understanding and predicting personal learning needs and performance. However, the processes and requirements behind the beneficial application of Learning and Knowledge Analytics as well as the consequences for learning and teaching are still far from being understood. In this paper, we explore the key dimensions of Learning Analytics (LA), the critical problem zones, and some potential dangers to the beneficial exploitation of educational data. We propose and discuss a generic design framework that can act as a useful guide for setting up Learning Analytics services in support of educational practice and learner guidance, in quality assurance, curriculum development, and in improving teacher effectiveness and efficiency. Furthermore, the presented article intends to inform about soft barriers and limitations of Learning Analytics. We identify the required skills and competences that make meaningful use of Learning Analytics data possible to overcome gaps in interpretation literacy among educational stakeholders. We also discuss privacy and ethical issues and suggest ways in which these issues can be addressed through policy guidelines and best practice examples.
+ Translating Learning into Numbers: A Generic Framework for Learning Analytics
+ Translating Learning into Numbers
+
+
+ 15
+ 3
+ Journal of Educational Technology & Society
+ ISSN 1176-3647
+ Journal of Educational Technology & Society
+
+
+ journalArticle
+
+
+
+
+
+ Konstan
+ Joseph A.
+
+
+
+
+ Walker
+ J. D.
+
+
+
+
+ Brooks
+ D. Christopher
+
+
+
+
+ Brown
+ Keith
+
+
+
+
+ Ekstrand
+ Michael D.
+
+
+
+
+
+
+ learning assessment
+
+
+
+
+ Massively Online Open Course (MOOC)
+
+
+
+
+ http://doi.acm.org/10.1145/2728171
+
+
+ 10:1–10:23
+ April 2015
+ 2016-09-03 20:38:02
+ ACM Digital Library
+ Teaching Recommender Systems at Large Scale: Evaluation and Lessons Learned from a Hybrid MOOC
+ Teaching Recommender Systems at Large Scale
+
+
+ 22
+ 2
+ ACM Trans. Comput.-Hum. Interact.
+ ISSN 1073-0516
+ DOI 10.1145/2728171
+
+
+ book
+
+
+ International Educational Data Mining Society
+
+
+
+
+
+
+ Matsuda
+ Noboru
+
+
+
+
+ Furukawa
+ Tadanobu
+
+
+
+
+ Bier
+ Norman
+
+
+
+
+ Faloutsos
+ Christos
+
+
+
+
+
+
+
+ Automation
+
+
+
+ Comparative Analysis
+
+
+
+ Correlation
+
+
+ data
+
+
+
+ Formative Evaluation
+
+
+
+ models
+
+
+ Online Courses
+
+
+ Skills
+
+
+
+ http://eric.ed.gov/?id=ED560513
+
+
+ 2015/06/00
+ 2016-09-03 20:48:57
+ ERIC
+ en
+ How can we automatically determine which skills must be mastered for the successful completion of an online course? Large-scale online courses (e.g., MOOCs) often contain a broad range of contents frequently intended to be a semester's worth of materials; this breadth often makes it difficult to articulate an accurate set of skills and knowledge (i.e., a skill model, or the QMatrix). We have developed an innovative method to discover skill models from the data of online courses. Our method assumes that online courses have a pre-defined skill map for which skills are associated with formative assessment items embedded throughout the online course. Our method carefully exploits correlations between various parts of student performance, as well as in the text of assessment items, to build a superior statistical model that even outperforms human experts. To evaluate our method, we compare our method with existing methods (LFA) and human engineered skill models on three Open Learning Initiative (OLI) courses at Carnegie Mellon University. The results show that (1) our method outperforms human-engineered skill models, (2) skill models discovered by our method are interpretable, and (3) our method is remarkably faster than existing methods. These results suggest that our method provides a significant contribution to the evidence-based, iterative refinement of online courses with a promising scalability. [For complete proceedings, see ED560503.]
+ Machine Beats Experts: Automatic Discovery of Skill Models for Data-Driven Online Course Refinement
+ Machine Beats Experts
+
+
+ attachment
+
+ Matsuda et al_2015_Machine Beats Experts.pdf
+ 2
+ application/pdf
+
+
+ attachment
+
+
+
+ http://eric.ed.gov/?id=ED560513
+
+
+ 2016-09-03 20:48:57
+ Snapshot
+ 1
+ text/html
+
+
+ bookSection
+
+
+ Introduction to Social Network Methods
+
+
+
+
+
+
+ Hanneman
+ R.A.
+
+
+
+
+ Riddle
+ M.
+
+
+
+
+
+
+
+ http://faculty.ucr.edu/~hanneman/nettext/C1_Social_Network_Data.html
+
+
+ 2016-01-18 20:17:24
+ 2016-01-18 20:17:24
+ Chapter 1: Social Network Data
+
+
+ attachment
+
+
+ http://faculty.ucr.edu/~hanneman/nettext/C1_Social_Network_Data.html
+
+
+ 2016-01-18 20:17:25
+ Introduction to Social Network Methods: Chapter 1: Social Network Data
+ 3
+ text/html
+
+
+ bookSection
+
+
+ ISBN 978-0-9952408-0-3
+ The Handbook of Learning Analytics
+
+
+
+
+
+
+ Vancouver, BC, Canada
+
+
+
+
+
+
+
+
+ Klerkx
+ Joris
+
+
+
+
+ Verbert
+ Katrien
+
+
+
+
+ Duval
+ Erik
+
+
+
+
+
+ www.solaresearch.org
+
+ 1
+ 2017
+ EN
+ Learning Analytics Dashboards
+
+
+ bookSection
+
+
+ ISBN 978-0-9952408-0-3
+ The Handbook of Learning Analytics
+
+
+
+
+
+
+ Alberta, Canada
+
+
+ Society for Learning Analytics Research (SoLAR)
+
+
+
+
+
+
+ Bergner
+ Yoav
+
+
+
+
+
+
+
+
+ Lang
+ Charles
+
+
+
+
+ Siemens
+ George
+
+
+
+
+ Wise
+ Alyssa Friend
+
+
+
+
 + Gašević
+ Dragan
+
+
+
+
+
+
+ http://solaresearch.org/hla-17/hla17-chapter1
+
+
+ 1
+ 34-48
+ 2017
 + Psychological measurement is a process for making warranted claims about states of mind. As such, it typically comprises the following: defining a construct; specifying a measurement model and (developing) a reliable instrument; analyzing and accounting for various sources of error (including operator error); and framing a valid argument for particular uses of the outcome. Measurement of latent variables is, after all, a noisy endeavor that can nevertheless have high-stakes consequences for individuals and groups. This chapter is intended to serve as an introduction to educational and psychological measurement for practitioners in learning analytics and educational data mining. It is organized thematically rather than historically, from more conceptual material about constructs, instruments, and sources of measurement error toward increasing technical detail about particular measurement models and their uses. Some of the philosophical differences between explanatory and predictive modelling are explored toward the end.
+ Measurement and its Uses in Learning Analytics
+
+
+ bookSection
+
+
+ ISBN 978-0-9952408-0-3
+ The Handbook of Learning Analytics
+
+
+
+
+
+
+ Alberta, Canada
+
+
+ Society for Learning Analytics Research (SoLAR)
+
+
+
+
+
+
+ Brooks
+ Christopher
+
+
+
+
+ Thompson
+ Craig
+
+
+
+
+
+
+
+
+ Lang
+ Charles
+
+
+
+
+ Siemens
+ George
+
+
+
+
+ Wise
+ Alyssa Friend
+
+
+
+
 + Gašević
+ Dragan
+
+
+
+
+
+
+ http://solaresearch.org/hla-17/hla17-chapter1
+
+
+ 1
+ 61-68
+ 2017
 + This article describes the process, practice, and challenges of using predictive modelling in teaching and learning. In both the fields of educational data mining (EDM) and learning analytics (LA) predictive modelling has become a core practice of researchers, largely with a focus on predicting student success as operationalized by academic achievement. In this chapter, we provide a general overview of considerations when using predictive modelling, the steps that an educational data scientist must consider when engaging in the process, and a brief overview of the most popular techniques in the field.
+ Predictive Modelling in Teaching and Learning
+
+
+ bookSection
+
+
+ ISBN 978-0-9952408-0-3
+ The Handbook of Learning Analytics
+
+
+
+
+
+
+ Vancouver, BC
+
+
+ Society for Learning Analytics Research
+
+
+
+
+
+
+ Prinsloo
+ P
+
+
+
+
+ Slade
+ S
+
+
+
+
+
+
+ https://solaresearch.org/hla-17/hla17-chapter4/
+
+
+ 1
+ 49-57
+ March 2017
+ EN
+ Ethics and Learning Analytics: Charting the (Un)Charted
+
+
+ bookSection
+
+
+ ISBN 978-0-9952408-0-3
+ The Handbook of Learning Analytics
+
+
+
+
+
+
+ Vancouver, BC
+
+
+ Society for Learning Analytics Research
+
+
+
+
+
+
+ Liu
+ R
+
+
+
+
+ Koedinger
+ K
+
+
+
+
+
+
+ https://solaresearch.org/hla-17/hla17-chapter6/
+
+
+ 1
+ 69-76
+ March 2017
+ EN
+ Going Beyond Better Data Prediction to Create Explanatory Models of Educational Data
+
+
+ journalArticle
+
+ Significance
+
+
+
+
+
+ Gelman
+ A
+
+
+
+
+ Niemi
+ J
+
+
+
+
+ 134-136
+ September 2011
+ Statistical graphics: making information clear – and beautiful
+
+
+ journalArticle
+
+
+ 38
+ 2
+ The American Statistician
+
+
+
+
+
+
+ Wainer
+ H
+
+
+
+
+ 137-147
+ 1984
+ How to display data badly
+
+
+ journalArticle
+
+
+
+
+
+
+
+ Gelman
+ A
+
+
+
+
+ Unwin
+ A
+
+
+
+
+ 2012
+ Infovis and Statistical Graphics: Different Goals, Different Looks (with discussion)
+
+
+ blogPost
+
+ Junkcharts
+
+
+
+
+
+ Fung
+ K
+
+
+
+
+
+
+ http://junkcharts.typepad.com/junk_charts/junk-charts-trifecta-checkup-the-definitive-guide.html
+
+
+ 2014
+ Blog
+ Junkcharts Trifecta Checkup: The Definitive Guide
+
+
diff --git a/README.html b/README.html
deleted file mode 100644
index 3a22094..0000000
--- a/README.html
+++ /dev/null
@@ -1,507 +0,0 @@
-
-
-
-
-
New class motto: “If it’s not messing up, it’s not technology”
-
The Internet and mobile computing are changing our relationship to data. Data can be collected from more people, across longer periods of time, and on a greater number of variables, at a lower cost and with less effort than ever before. This has brought opportunities and challenges to many domains, but the full impact on education is only beginning to be felt. On the one hand, there is a critical mass of educators, technologists and investors who believe that there is great promise in the analysis of this data. On the other, there are concerns about what the utilization of this data may mean for education and society more broadly. Data Science in Education provides an overview of the use of new data sources in education with the aim of developing students’ ability to perform analyses and critically evaluate the technologies and consequences of this emerging field. It covers methods and technologies associated with Data Science, Educational Data Mining and Learning Analytics, and discusses the opportunities for education that these methods present and the problems that they may create.
-
No previous experience in statistics, computer science or data manipulation is expected. However, students will be encouraged to get hands-on experience applying methods or technologies to educational problems. Students will be assessed on their understanding of technological and analytical innovations and on how they critique the consequences of these innovations within the broader educational context.
-
-
-
Course Goals
-
The overarching goal of this course is for students to acquire the knowledge and skills to be intelligent producers and consumers of data science in education. By the end of the course students should:
* Systematically develop a line of inquiry utilizing data to make an argument about learning
* Be able to evaluate the implications of data science for educational research, policy, and practice
-
This necessarily means that students become comfortable with the educational applications of three domain areas: computer science, statistics, and the context surrounding data use. There is no expectation that students become experts in any one of these areas; rather, the course will aim to enhance student competency in identifying issues at the level of data acquisition, data analysis, and the application of analysis in education.
-
-
-
Assessment
-
In EDCT-GE 2550 students will attempt several data science projects; however, unlike most courses, students will not be assessed on how successful they are in completing these projects. Rather, students will be assessed on two key components for future success: contribution and organization. Contribution reflects the extent to which students participate in the course: how often they tweet, whether or not they complete assignments and quizzes, attend class, etc. Organization reflects how well students document their process and maintain data and software resources, for example, maintaining a well-organized Zotero library with notes, a well-organized GitHub account, and organized data sets that are labelled appropriately. Doing well in EDCT-GE 2550 requires that students finish the course with the resources to successfully use data science in education in the future. Do the work and stay organized and all will be well!
-
Tasks that need to be completed during the semester:
-
-
Attend class
-
Weekly readings
-
Comment on readings on Twitter
-
Weekly in class questionnaire
-
Maintain documentation of work (GitHub, R Markdown, Zotero)
-
Ask one question on Stack Overflow
-
In person meeting with instructor
-
8 short assignments (including one group assignment)
-
Group presentation of group assignment, 3-5 students each
-
Produce one argument about learning using data from the class