May 3, 2017 By Training Itversity

Ambari and MySQL


As part of this topic, we will review the following tools in the lab.

  • Ambari – to understand the cluster configuration
  • MySQL Database

You can get lab credentials from here (provided you have a subscription).

Ambari

  • Ambari comes with the Hortonworks platform
  • Our lab is set up using the Hortonworks platform
  • You need to sign in to the Big Data labs website to get lab credentials (after subscribing)
  • Quick links can be seen as part of this URL – Lab page (requires subscription)
  • You can also go to the Ambari page directly by clicking here
  • Use the credentials from the lab page
  • The video below covers details about the lab. This video is very important.
    • Services on the Dashboard
    • Hosts
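Besides the web UI, Ambari also exposes a REST API, which can be handy for checking the cluster from a terminal. The sketch below is hypothetical: the host, port, and credentials are placeholders, not guaranteed lab values; substitute whatever your lab credentials page shows.

```shell
# Placeholders -- replace with the Ambari host and the credentials
# from your lab page before running.
AMBARI_HOST="gw01.itversity.com:8080"
AMBARI_USER="your_user"
AMBARI_PASS="your_password"

# List the clusters managed by this Ambari instance
curl -s -u "$AMBARI_USER:$AMBARI_PASS" \
  "http://$AMBARI_HOST/api/v1/clusters"

# List the services (HDFS, YARN, Hive, etc.) in a cluster;
# "Sandbox" here is an assumed cluster name -- use the one
# returned by the previous call.
curl -s -u "$AMBARI_USER:$AMBARI_PASS" \
  "http://$AMBARI_HOST/api/v1/clusters/Sandbox/services"
```

The same information is visible under Services and Hosts on the Ambari dashboard; the API is just an alternative view.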

MySQL Database

  • You need to sign in to the Big Data labs website to get lab credentials (after subscribing)
  • Wondering why we are talking about a relational database such as MySQL in Big Data?
  • At times we need to ingest data from relational databases into Hadoop
  • In our lab we provide a MySQL server and several databases to practice with
  • We will set up additional databases in the future
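Once you have your credentials, you can connect to the lab's MySQL server with the standard mysql client, and later in the course ingest tables into Hadoop with Sqoop. The host, user, and database names below are placeholders for illustration; use the values shown on your lab page.

```shell
# Connect to the MySQL server in the lab.
# Host, user, and database are placeholders -- take the real
# values from your lab credentials page. You will be prompted
# for the password (-p).
mysql -h ms.example.com -u lab_user -p lab_db

# Tables from such a database are later ingested into Hadoop
# with Sqoop; a typical import (placeholders again) looks like:
sqoop import \
  --connect jdbc:mysql://ms.example.com/lab_db \
  --username lab_user -P \
  --table orders \
  --target-dir /user/$USER/orders
```

This is exactly the workflow the "Data Ingestion - Apache Sqoop" section of the course walks through in detail.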



