AWS Certified Big Data – Specialty Complete Video Course and Practice Test

English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 9h 41m | 2.35 GB

AWS leads the world in cloud computing and big data. This course offers the complete package to help practitioners master the core skills and competencies needed to build successful, high-value big data applications, with a clear path toward passing the certification exam AWS Certified Big Data – Specialty.

This course provides a solid foundation in all areas required to pass the AWS Certified Big Data – Specialty exam, including Collection, Storage, Processing, Analysis, Visualization, and Data Security. In addition, multiple quizzes and a practice exam prepare the student for the formal certification exam administered by AWS.

This course provides case study-based training, designed entirely around Jupyter notebook-based learning using only AWS big data technologies. Every exercise shown in this video can be run interactively by the students watching. The material focuses exclusively on AWS, with the goal of building enough foundation that the learner can achieve certification.
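To give a flavor of the kind of notebook exercise described above, here is a minimal sketch of sending a record to an Amazon Kinesis data stream with boto3. The helper function, event shape, and stream name are illustrative assumptions, not material taken from the course itself.

```python
import json

def build_kinesis_record(event: dict, partition_key: str) -> dict:
    """Package an event as a Kinesis put_record payload.

    Hypothetical helper for illustration: the event fields and
    partition-key choice are assumptions, not course content.
    """
    return {
        "Data": json.dumps(event).encode("utf-8"),  # Kinesis expects bytes
        "PartitionKey": partition_key,              # determines the target shard
    }

record = build_kinesis_record({"sensor": "s1", "temp": 21.5}, "s1")

# With AWS credentials configured, the record could then be sent like this:
# import boto3
# kinesis = boto3.client("kinesis")
# kinesis.put_record(StreamName="demo-stream", **record)
```

Keeping the payload construction separate from the AWS call makes the logic testable in a notebook without live credentials.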

Most companies struggle with widely varied, high-volume, and fast-moving data. After years of hype around big data, tools and infrastructure have improved to the point where companies are well beyond pilot projects and proofs of concept. Big data, where parallel processing is needed just to do the work, is the new normal. This course offers a clear path toward AWS Big Data certification, a valuable credential in a competitive job market, and fills a known gap in this rapidly growing space.

What You Will Learn

  • Learn how to perform collection tasks on AWS
  • Learn how to use the appropriate storage solution for Big Data on AWS
  • Learn how to perform processing tasks on the AWS platform
  • Learn how to combine Visualization, Analysis, and Data Security to reason about Big Data on AWS
  • Learn how to think about the AWS Big Data Certification exam to optimize for the best outcome

Who Should Take This Course

  • You are a DevOps Engineer who wants to understand how to operationalize Big Data workloads.
  • You are a Software Engineer who wants to master Big Data terminology and practices on AWS.
  • You are a Machine Learning Engineer who wants to solidify your knowledge of AWS Big Data practices.
  • You are a Product Manager who needs to understand the AWS Big Data lifecycle.
  • You are a Data Scientist who runs Big Data workloads on AWS.

Table of Contents

01 Learning objectives
02 1.1 Learn the answer to “What is Big Data?”
03 1.2 Explore the history of Big Data
04 1.3 Know AWS Certification – 6 domain areas
05 1.4 Understand the AWS Certification Exam – blueprint
06 1.5 Learn an exam strategy
07 1.6 Identify focus areas
08 1.7 Learn exam tips & tricks
09 1.8 Learn how to register for an AWS Certification Exam
10 Learning objectives
11 2.1 Introduction & overview
12 2.2 Concepts
13 2.3 Approaches to data collection
14 2.4 Scenario 1
15 2.5 Scenario 2
16 2.6 Scenario 3
17 2.7 Demo – AWS Kinesis
18 2.8 Review & conclusion
19 Learning objectives
20 3.1 Optimize the operational characteristics of the storage solution
21 3.2 Determine data access and retrieval patterns
22 3.3 Evaluate mechanisms for capture, update, and retrieval of catalog entries
23 3.4 Determine appropriate data structure and storage format
24 3.5 Understand storage & database fundamentals
25 3.6 Learn S3 – storage
26 3.7 Understand Glacier – backup & archive
27 3.8 Create AWS Glue – data catalog
28 3.9 Use DynamoDB
29 4.1 Identify the appropriate data processing technology for a given scenario
30 4.2 Design and architect the data processing solution
31 4.3 Determine the operational characteristics of the solution implemented
32 4.4 Understand AWS processing – overview
33 4.5 Understand Elastic MapReduce (EMR)
34 4.6 Learn about Apache Hadoop
35 4.7 Apply EMR – architecture
36 4.8 Understand EMR – operations
37 4.9 Use EMR – Hive
38 4.10 Use EMR – HBase
39 4.11 Use EMR – Presto
40 4.12 Use EMR – Spark
41 4.13 Implement EMR – storage & compression
42 4.14 Implement EMR – Lambda
43 Learning objectives
44 5.1 Determine how to design and architect the analytical solution
45 5.2 Understand Redshift overview
46 5.3 Learn Redshift design
47 5.4 Use Redshift data ingestion
48 5.5 Apply Redshift operations
49 5.6 Use AWS Elasticsearch – operational analytics
50 5.7 Understand Machine Learning – clustering and regression
51 5.8 Use AWS Athena – interactive analytics
52 Learning objectives
53 6.1 Overview of AWS QuickSight
54 6.2 Design and create the Visualization platform
55 6.3 Optimize the QuickSight operations
56 6.4 Understand critical QuickSight limitations
57 Learning objectives
58 7.1 Data governance
59 7.2 AWS Shared Responsibility Model
60 7.3 Identity and Access Management (IAM)
61 7.4 Encryption
62 7.5 Configure VPC
63 7.6 Implement Redshift Security
64 7.7 Implement EMR Security
65 Learning objectives
66 8.1 Understand Big Data for SageMaker
67 8.2 Learn SageMaker and EMR Integration
68 8.3 Learn Serverless Production Big Data application development
69 8.4 Implement Containerization for Big Data
70 8.5 Implement Spot Instances for Big Data Pipeline
71 Learning objectives
72 9.1 Prepare for the exam
73 9.2 Walk through the exam sections
74 9.3 Review the exam
75 10.1 Wrap up
76 Summary