Opportunities for Hadoopers are endless, from Hadoop Developer to Hadoop Tester to Hadoop Architect and beyond. If working with Big Data is your passion, think no more: join the Infoschool Hadoop courses and carve a niche for yourself! Happy Hadooping!
Greetings from Julian
The course covers the following modules:
HDFS, Setting up a Hadoop Cluster, MapReduce, Pig, Hive, HBase, ZooKeeper and Sqoop.
We offer a Java course along with the Hadoop Development course to get you going.
The complete course curriculum with topic level details can be found at www.infoschool.in
After the completion of the Hadoop Course at Infoschool, you should be able to:
Master the concepts of the Hadoop Distributed File System.
Understand Cluster Setup and Installation.
Understand MapReduce and functional programming.
Implement HBase, MapReduce integration, advanced usage and advanced indexing.
Have a good understanding of the ZooKeeper service and Sqoop.
Develop a working Hadoop Architecture.
Hadoop Developer Module
Hardware Requirements :-- Systems must have at least 2 GB of RAM.
Software Requirements :-- I will provide all software (including the operating system).
Basics & Installations
Hadoop Processes (NameNode, Secondary NameNode, JobTracker, DataNode, TaskTracker)
Hadoop File System
Common errors when running a Hadoop cluster, and their solutions
HDFS - Hadoop Distributed File System
HDFS Design and Architecture
Interacting with HDFS using the command line
Interacting with HDFS using the Java APIs
Joining datasets in MapReduce jobs
MapReduce customization
Custom Input format class
Custom Output format class
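The command-line interaction with HDFS listed above can be sketched as follows. This is a minimal example, assuming a cluster with the `hdfs` client on the PATH; the `/user/demo` path is illustrative, and the cluster-only commands are guarded so the snippet is harmless without Hadoop installed:

```shell
# Create a local file to upload (runs anywhere).
echo "hello hadoop" > sample.txt

# The HDFS commands below need a Hadoop installation, so they are
# guarded by a check for the client:
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -mkdir -p /user/demo            # make a directory in HDFS
  hdfs dfs -put -f sample.txt /user/demo/  # upload the local file
  hdfs dfs -ls /user/demo                  # list the directory
  hdfs dfs -cat /user/demo/sample.txt      # read the file back
fi
```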
Hadoop Programming Languages :-
Secondary NameNode
Developing a MapReduce Application
Phases in the MapReduce Framework
MapReduce Input and Output Formats
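The map, shuffle/sort, and reduce phases listed above can be illustrated locally with a Hadoop-Streaming-style word count. The pipe below simulates the framework's shuffle with `sort`; the script names are our own, not part of Hadoop:

```shell
# Mapper: emit "word<TAB>1" for every word on stdin (the map phase).
cat > map.sh <<'EOF'
#!/bin/sh
awk '{for (i = 1; i <= NF; i++) print $i "\t1"}'
EOF

# Reducer: sum the counts per word; input arrives grouped by key
# because of the sort step (the shuffle/sort phase).
cat > reduce.sh <<'EOF'
#!/bin/sh
awk -F'\t' '{count[$1] += $2} END {for (w in count) print w "\t" count[w]}'
EOF
chmod +x map.sh reduce.sh

# Simulate map -> shuffle/sort -> reduce with a local pipe.
printf 'to be or not to be\n' | ./map.sh | sort | ./reduce.sh
```

On a real cluster the same two scripts could be submitted with the Hadoop Streaming jar, with HDFS paths for input and output.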
Installation and Configuration
Interacting with HDFS using Pig
MapReduce programs through Pig
Loading, filtering, grouping, and more
Data types, operators, and more
Sample Pig programs with real-time examples
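As a small taste of the Pig topics above, here is a Pig Latin script that loads, filters, and groups data (the file name, fields, and aliases are illustrative). With Pig installed it runs in local mode, no cluster needed:

```shell
# Write a small Pig Latin script demonstrating load/filter/group.
cat > demo.pig <<'EOF'
-- load a tab-separated file of (name, age) records
people = LOAD 'people.tsv' USING PigStorage('\t') AS (name:chararray, age:int);
-- filtering: keep adults only
adults = FILTER people BY age >= 18;
-- grouping and counting per age
by_age = GROUP adults BY age;
counts = FOREACH by_age GENERATE group AS age, COUNT(adults) AS n;
DUMP counts;
EOF

if command -v pig >/dev/null 2>&1; then
  pig -x local demo.pig   # local mode: runs without a Hadoop cluster
fi
```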
HIVE
Installation and Configuration
Interacting with HDFS using Hive
MapReduce programs through Hive
Sample programs with real-time examples
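A short HiveQL sketch matching the Hive topics above; the table and column names are our own. Hive compiles the SELECT into MapReduce jobs behind the scenes, which is the "MapReduce programs through Hive" idea:

```shell
# Write a small HiveQL script (illustrative table and columns).
cat > demo.hql <<'EOF'
-- an HDFS-backed table over tab-separated log lines
CREATE TABLE IF NOT EXISTS visits (visitor STRING, page STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
-- Hive turns this aggregation into MapReduce jobs
SELECT page, COUNT(*) AS hits
FROM visits
GROUP BY page
ORDER BY hits DESC;
EOF

if command -v hive >/dev/null 2>&1; then
  hive -f demo.hql   # requires a working Hive installation
fi
```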
NoSQL Database Concepts
Basics & Installations
Interacting HBase with HDFS
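For the HBase basics above, here is a sketch of a shell session (the table and column-family names are illustrative). The commands are saved to a file that can be fed to `hbase shell` on a machine with a running HBase instance:

```shell
# Write an HBase shell command file (needs a running HBase to execute).
cat > hbase_demo.txt <<'EOF'
create 'users', 'info'                       # table with one column family
put 'users', 'row1', 'info:name', 'alice'    # write one cell
get 'users', 'row1'                          # read the row back
scan 'users'                                 # full-table scan
disable 'users'
drop 'users'
EOF

if command -v hbase >/dev/null 2>&1; then
  hbase shell hbase_demo.txt   # run against a live HBase instance
fi
```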
Basics & Installations
All queries for processing data
ETL Tool (Data Warehousing / BI Tools) :--
Creating an RDBMS database
Establishing a connection between PDI and the RDBMS database
Creating data in Hadoop
Establishing a connection between PDI and Hadoop data
Moving data from Hadoop to the RDBMS
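Moving data between Hadoop and an RDBMS is also what Sqoop (covered earlier) is for. Below is a hedged sketch of both directions; the MySQL connection string, table names, and HDFS paths are hypothetical. The commands are saved to a script to run on a machine with Sqoop and cluster access:

```shell
# Save illustrative Sqoop commands to a script (not executed here,
# since they need Sqoop, a database, and a Hadoop cluster).
cat > sqoop_moves.sh <<'EOF'
#!/bin/sh
# RDBMS -> HDFS: import a table into a target directory
sqoop import \
  --connect jdbc:mysql://dbhost/shop --username demo -P \
  --table orders --target-dir /user/demo/orders

# HDFS -> RDBMS: export the files back into a table
sqoop export \
  --connect jdbc:mysql://dbhost/shop --username demo -P \
  --table orders_copy --export-dir /user/demo/orders
EOF
chmod +x sqoop_moves.sh
```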
Our technical offerings, delivered by industry experts and professionals:
Web Development (Web 2.0, PHP, MySQL, Ajax, CSS)
Oracle Database Administration
Microsoft SQL Server Database Administration
Dot Net Framework
VMware vSphere Suite
C/C++ and Java Programming
Microsoft Office (Word, Excel, PowerPoint, SharePoint, Exchange)
SAS Base and Advanced
SAS Analytics and SAS Finance
Hadoop Big Data
Data Warehousing With Cognos, Informatica and MSBI