Interested in increasing your knowledge of the Big Data landscape? This course is for those new to data science and interested in understanding why the Big Data Era has come to be. It is for those who want to become conversant with the terminology and the core concepts behind big data problems, applications, and systems. It is for those who want to start thinking about how Big Data might be useful in their business or career. It provides an introduction to one of the most common frameworks, Hadoop, which has made big data analysis easier and more accessible, increasing the potential for data to transform our world!

At the end of this course, you will be able to:
* Describe the Big Data landscape, including examples of real-world big data problems and the three key sources of Big Data: people, organizations, and sensors.
* Explain the V’s of Big Data (volume, velocity, variety, veracity, valence, and value) and why each impacts data collection, monitoring, storage, analysis and reporting.
* Get value out of Big Data by using a 5-step process to structure your analysis.
* Distinguish big data problems from problems that are not, and recast big data problems as data science questions.
* Provide an explanation of the architectural components and programming models used for scalable big data analysis.
* Summarize the features and value of core Hadoop stack components including the YARN resource and job management system, the HDFS file system and the MapReduce programming model.
* Install and run a program using Hadoop!
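To preview the MapReduce programming model named in the objectives above, here is a minimal sketch in plain Python that simulates the map, shuffle, and reduce phases of the classic word-count example. This is illustrative only: in a real Hadoop job these phases run distributed across a cluster, and the function names here are invented for the sketch.

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    # Shuffle: group all counts by word (Hadoop performs this
    # grouping automatically between the map and reduce phases).
    grouped = defaultdict(list)
    for word, count in pairs:
        grouped[word].append(count)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data is big", "data is everywhere"]
word_counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(word_counts)  # {'big': 2, 'data': 2, 'is': 2, 'everywhere': 1}
```

The value of the model is that map and reduce are independent per key, so Hadoop can run them in parallel on many machines without the programmer writing any coordination code.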
This course is for those new to data science. No prior programming experience is needed, although the ability to install applications and utilize a virtual machine is necessary to complete the hands-on assignments.
Hardware Requirements:
(A) Quad Core Processor (VT-x or AMD-V support recommended), 64-bit; (B) 8 GB RAM; (C) 20 GB free disk space. How to find your hardware information: (Windows): Open System by clicking the Start button, right-clicking Computer, and then clicking Properties; (Mac): Open Overview by clicking on the Apple menu and clicking “About This Mac.” Most computers with 8 GB RAM purchased in the last 3 years will meet the minimum requirements. You will need a high-speed internet connection because you will be downloading files up to 4 GB in size.

Software Requirements: This course relies on several open-source software tools, including Apache Hadoop. All required software can be downloaded and installed free of charge. Software requirements include: Windows 7+, Mac OS X 10.10+, Ubuntu 14.04+, or CentOS 6+; VirtualBox 5+.
-
Week 1 - Big Data: Why and Where
-
What’s in Big Data Applications and Systems?
-
By the end of this course you will be able to…
-
What a Supercomputer is?
-
What launched the Big Data era?
-
Applications: What makes big data valuable
-
Example: Saving lives with Big Data
-
Example: Using Big Data to Help Patients
-
A Sentiment Analysis Success Story: Meltwater helping Danone
-
Did you know? 25 Facts about Big Data
-
Slides: What Launched the Big Data Era?
-
Slides: What makes Big Data Valuable?
-
Slides: Saving Lives with Big Data
-
Slides: Using Big Data to help Patients
-
Getting Started: Where Does Big Data Come From?
-
Machine-Generated Data: It’s Everywhere and There’s a Lot!
-
Machine-Generated Data: Advantages
-
Big Data Generated By People: The Unstructured Challenge
-
Big Data Generated By People: How Is It Being Used?
-
Organization-Generated Data: Structured but often siloed
-
Organization-Generated Data: Benefits Come From Combining With Other Data Types
-
The Key: Integrating Diverse Data
-
Who are you providing data to?
-
Why Big Data and Where Did it Come From?
-
Extra Resources
-
Supplementary Resources – Slides
-
-
Week 2 - Characteristics of Big Data and Dimensions of Scalability
-
Getting Started: Characteristics Of Big Data
-
Characteristics of Big Data – Volume
-
What does Astronomical Scale mean?
-
Characteristics of Big Data – Variety
-
Characteristics of Big Data – Velocity
-
Characteristics of Big Data – Veracity
-
Characteristics of Big Data – Valence
-
The Sixth V: Value
-
A “Small” Definition of Big Data
-
V for the V’s of Big Data
-
Practice: Writing Big Data questions
-
Practice: Improving the Flamingo Game
-
Supplementary Resources – Slides
-
Data Science: Getting Value out of Big Data
-
Building a Big Data Strategy
-
How Does Big Data Science Happen? Five Components of Data Science
-
5 P’s of Data Science
-
Practice: Thinking more deeply about the Ps
-
Asking the Right Questions
-
Steps in the Data Science Process
-
Step 1: Acquiring Data
-
Step 2-A: Exploring Data
-
Step 2-B: Pre-Processing Data
-
Step 3: Analyzing Data
-
Step 4: Communicating Results
-
Step 5: Turning Insights into Action
-
Practice: Building a Team
-
Data Science 101
-
Supplementary Resources – Slides
-
-
Week 3 - Foundations for Big Data Systems and Programming
-
Getting Started: Why worry about foundations?
-
What is a Distributed File System?
-
Scalable Computing over the Internet
-
Programming Models for Big Data
-
Foundations for Big Data
-
Supplementary Resources – Slides
-
Hadoop: Why, Where and Who?
-
The Hadoop Ecosystem: Welcome to the zoo!
-
The Hadoop Distributed File System: A Storage System for Big Data
-
YARN: A Resource Manager for Hadoop
-
MapReduce: Simple Programming for Big Results
-
Practice: MapReduce
-
When to Reconsider Hadoop?
-
Cloud Computing: An Important Big Data Enabler
-
Cloud Service Models: An Exploration of Choices
-
Value From Hadoop and Pre-built Hadoop Images
-
Supplementary Resources – Slides
-
Intro to Hadoop
-
Assignment - Understand by Doing: MapReduce
-
Hands-on: Download and Install Hadoop (Mac)
-
Hands-on: Download and Install Hadoop (Windows)
-
Hands-on: Download and Install Hadoop FAQ
-
Copy your data into the Hadoop Distributed File System (HDFS)
-
Run the WordCount Program Instructions
-
Run the WordCount program
-
Running Hadoop MapReduce Programs Quiz
-
Practice: MapReduce in your life
-
Hands-on: Optional Materials
-