DUCAT is an ideal place for anyone hoping to learn Big Data Hadoop. Our training presents complex concepts in a way that lets learners grasp both the benefits and the difficulties and become experts. Apache Hadoop is open-source software for reliable, scalable, distributed computing, and it has been the driving force behind the growth of the big data industry. Hadoop brings the ability to inexpensively process large amounts of data, regardless of its structure. By large, we mean from 10-100 gigabytes and above. With DUCAT, a learner gets the opportunity to master all the technical details and become proficient in no time. DUCAT has designed a variety of training programs depending on demand and available time. This course in particular is arranged so that it completes the full training within a short period, saving money and valuable time; it can be especially helpful for people who are already working. The teaching staff at DUCAT believe in building a beginner up from the basics and making an expert of them. Various forms of instruction are used: tests, mock tasks, and practical problem-solving lessons are undertaken. The practice-based training modules are specifically designed by DUCAT to bring out a specialist in everyone.

INTRODUCTION TO BIG DATA

  • Characteristics of Big Data
  • Big data collection and cleanup
  • Why analyze big data
  • Why parallel computing is important
  • Various products for handling big data

INTRODUCING HADOOP

  • Hadoop Stack
  • Components of Hadoop
  • Starting Hadoop
  • Various Hadoop processes
  • Hands On

WORKING WITH HDFS

  • Basic file commands
  • Reading & writing to files
  • Run a word count on a large text file
  • Web based UI
  • View job status from the Hadoop prompt
  • View job status in the web UI
  • High availability
  • Federation
  • Hands On
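The word count mentioned above is the canonical first Hadoop exercise. Before running it on a cluster, it can help to see the same computation done locally; a minimal sketch in plain Python (the sample input is illustrative), handy for checking a MapReduce job's output on a small file:

```python
from collections import Counter

def word_count(lines):
    """Count whitespace-separated words across an iterable of lines,
    mirroring what the classic Hadoop word-count job computes."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return dict(counts)

if __name__ == "__main__":
    sample = ["hello hadoop", "hello hdfs"]
    print(word_count(sample))  # {'hello': 2, 'hadoop': 1, 'hdfs': 1}
```

On the cluster, the same result comes from the word-count job shipped with the Hadoop examples jar, run against a file uploaded to HDFS with the basic file commands covered in this module.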

YARN

  • Architecture
  • Scheduler
  • Resource Manager
  • Yarn Hands On

INSTALLATION & CONFIGURING HADOOP

  • Types of installation (standalone, distributed)
  • Hadoop distributions (Apache, Cloudera and Hortonworks)
  • Set up Linux for Hadoop installation (Java and SSH)
  • Hadoop directory structure
  • XML, masters and slave files
  • Checking system health
  • Checking file system health
  • Block size, replication factor and block health monitoring
  • Benchmarking cluster
  • Hands On
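The block size and replication factor listed above are set in hdfs-site.xml, one of the XML configuration files this module covers. A minimal sketch (the values shown are illustrative defaults, not recommendations):

```xml
<!-- hdfs-site.xml: illustrative values only -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>          <!-- number of copies kept of each block -->
  </property>
  <property>
    <name>dfs.blocksize</name>
    <value>134217728</value>  <!-- 128 MB block size, in bytes -->
  </property>
</configuration>
```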

ADVANCED ADMINISTRATION ACTIVITIES

  • Superuser
  • Authorization
  • Secure Mode
  • Adding and de-commissioning nodes
  • Secondary NameNode
  • Failover
  • Manage Quotas
  • Enabling Trash
  • Hands On

MONITORING HADOOP CLUSTER

  • Hadoop infrastructure monitoring
  • Hadoop specific monitoring
  • Install and configure Nagios / Ganglia
  • Capture metrics
  • Hands on

OTHER COMPONENTS OF HADOOP ECOSYSTEM

  • Discuss Hive, Sqoop, Pig, HBase, Flume
  • Use cases of each
  • Use Hadoop Streaming to write code in Perl / Python
  • Hands on
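Hadoop Streaming lets any executable that reads stdin and writes stdout act as a mapper or reducer. A minimal word-count sketch in Python (the file name wordcount.py and the map/reduce command-line switch are illustrative choices, not part of the Streaming API):

```python
import sys

def mapper(lines):
    """Map step: emit one tab-separated 'word\t1' record per word,
    the record format Hadoop Streaming expects on stdout."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(lines):
    """Reduce step: sum the counts for each word. Input must be sorted
    by key, which is what Hadoop's shuffle phase guarantees."""
    current, total = None, 0
    for line in lines:
        word, count = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                yield f"{current}\t{total}"
            current, total = word, 0
        total += int(count)
    if current is not None:
        yield f"{current}\t{total}"

if __name__ == "__main__":
    # Run as:  python wordcount.py map   (or)   python wordcount.py reduce
    step = mapper if sys.argv[1:2] == ["map"] else reducer
    for record in step(sys.stdin):
        print(record)
```

On a cluster this script would be submitted through the hadoop-streaming jar, passing the map and reduce invocations as the mapper and reducer commands; the exact jar path varies by distribution.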