With the goal of building a healthy ecosystem aligned with industry standards, REGex Software brings you a Training/Internship Program on “Power BI & Big Data”. We organize this Summer Training/Internship Program to improve the knowledge and skills of students and professionals, so that they can become experts in the field of Big Data and land their dream jobs in software development at major MNCs.
REGex Software Services’ Big Data program is a valuable resource for both beginners and experts. This specialization program introduces you to the domain of Data Engineering, including Hadoop, MapReduce, Hive, Apache Spark, Kafka Streaming, SQL, Power BI, Amazon EMR, and much more, from the basics to advanced topics. If you want to become a Data Engineer or Business Analyst, REGex designed this program for you.
20 Hours Per week
Duration: 20 Hours
Duration: 60 Hours
Duration: 50 Hours
Duration: 20 Hours
Duration: 20 Hours
Duration: 40 Hours
Duration: 20 Hours
Duration: 20 Hours
Duration: 10 Hours
Live sessions by expert trainers, with access to recorded sessions also available.
Depending on your performance, you can get internship/training opportunities to be placed at HP, DELL, Honeywell, Rightpoint, Frontdoor, Fractal, and many more.
Best IT training and internship company in Jaipur. Highly recommended. Supportive faculty and management, plus online and offline sessions with recording access, help every student concentrate more on learning. Practical learning and working on live projects with a team are the main highlights of REGex.
The experience of learning at the institute is really good. I joined the MERN full-stack course and am doing well. Thanks to the management for providing a certified facility. They are very helpful in solving my queries, and the institute gives me practical knowledge and demo projects to improve my skills. Thanks to REGex Software Services.
This training center is exceptional, providing me with extensive knowledge in various domains and technologies. I enrolled in the Python Django course eight months ago, where I learned website development. Prior to joining this coaching, I struggled with speaking English, but now I have gained the ability to communicate effectively. My experience has been extremely positive, and I strongly recommend joining REGex Software Services at the earliest opportunity.
Competitive Programming is the best course they have; I am part of both the Python and C++ courses. I cracked several interviews with their course, and the poll tests and assignments are always new and beneficial. The best CP course you will find is here; I hope this will be beneficial for you.
Tushar sir is the best at delivery. His approach is mind-blowing. I have not found any gap, although I am from the U.S. I have learnt lots of Big Data tools like Hadoop, Hive, Spark, Sqoop and, most amazingly, the Talend ETL tool, which was the loveliest part of the training. Every component is explained in very simple terms with a great practical approach.
I recently joined the Python Django (Web Development - Full Stack) course. About the course: the instructor makes every concept simple to understand; there is no copy-paste, and every line of code is explained; we are even given assignments and projects to work on. If you are looking to learn Python Django, I highly recommend going for this course.
I am from the UK and loved the teaching. Competitive Programming was the best experience I had in coding. I can truly say the money I spent is worth it. Go for it, guys!
● What’s Big Data?
● Big Data: 3V’s
● Explosion of Data
● What’s driving Big Data
● Applications for Big Data Analytics
● Big Data Use Cases
● Benefits of Big Data
● History of Hadoop
● Distributed File System
● What is Hadoop
● Characteristics of Hadoop
● RDBMS Vs Hadoop
● Hadoop Generations
● Components of Hadoop
● HDFS Blocks and Replication
● How Files Are Stored
● HDFS Commands
● Hadoop Daemons
● Difference between Hadoop 1.0 and 2.0
● New Components in Hadoop 2.x
● YARN/MRv2
● Configuration Files in Hadoop 2.x
● Major Hadoop Distributors/Vendors
● Cluster Management & Monitoring
● Hadoop Downloads
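As a quick illustration of the HDFS blocks-and-replication topics above, the arithmetic HDFS applies to a stored file can be sketched in a few lines of Python (the 128 MB block size and replication factor of 3 are the standard Hadoop 2.x defaults):

```python
import math

def hdfs_storage(file_size_mb, block_size_mb=128, replication=3):
    """Return (block_count, total_replicas, raw_storage_mb) for a file in HDFS."""
    blocks = math.ceil(file_size_mb / block_size_mb)
    replicas = blocks * replication
    # Raw storage: every block is written `replication` times across DataNodes;
    # the last block may be smaller than block_size_mb, so raw usage follows file size.
    raw_mb = file_size_mb * replication
    return blocks, replicas, raw_mb

# A 1 GB (1024 MB) file with Hadoop 2.x defaults:
blocks, replicas, raw = hdfs_storage(1024)
print(blocks, replicas, raw)  # 8 blocks, 24 block replicas, 3072 MB raw
```

The same calculation explains why many small files strain the NameNode: each file occupies at least one block entry in its metadata regardless of size.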
● What is distributed computing
● Introduction to Map Reduce
● Map Reduce components
● How MapReduce works
● Word Count execution
● Suitable & unsuitable use cases for MapReduce
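The word-count execution listed above can be simulated end to end in plain Python. This sketch mimics the map, shuffle and reduce phases of a MapReduce job without a cluster:

```python
from collections import defaultdict

def map_phase(line):
    # Mapper: emit a (word, 1) pair for every word in the input line.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle/sort: group all values by key, as the framework does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big ideas", "data drives big decisions"]
mapped = [pair for line in lines for pair in map_phase(line)]
result = reduce_phase(shuffle(mapped))
print(result)  # {'big': 3, 'data': 2, 'ideas': 1, 'drives': 1, 'decisions': 1}
```

In real Hadoop, the mapper and reducer run on different nodes and the shuffle moves data over the network, but the logical flow is exactly this.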
● Architecture
● Basic Syntax
● Import data from a table in a relational database into HDFS
● Import the results of a query from a relational database into HDFS
● Import a table from a relational database into a new or existing Hive table
● Insert or update data from HDFS into a table in a relational database
● Define a Hive-managed table
● Define a Hive external table
● Define a partitioned Hive table
● Define a bucketed Hive table
● Define a Hive table from a select query
● Define a Hive table that uses the ORCFile format
● Create a new ORCFile table from the data in an existing non-ORCFile Hive table
● Specify the delimiter of a Hive table
● Load data into a Hive table from a local directory
● Load data into a Hive table from an HDFS directory
● Load data into a Hive table as the result of a query
● Load a compressed data file into a Hive table
● Update a row in a Hive table
● Delete a row from a Hive table
● Insert a new row into a Hive table
● Join two Hive tables
● Use a subquery within a Hive query
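One way to picture a partitioned Hive table: Hive stores each partition as its own HDFS directory named `<column>=<value>` under the table's warehouse directory. A small Python sketch of that layout (the warehouse path, table name and partition columns are illustrative, not fixed by Hive):

```python
def partition_path(warehouse, table, **partition_values):
    """Build the HDFS directory Hive uses for one partition of a table.

    Hive stores each partition of a partitioned table under a nested
    directory named <col>=<value>; the paths here are illustrative.
    """
    parts = "/".join(f"{col}={val}" for col, val in partition_values.items())
    return f"{warehouse}/{table}/{parts}"

# A table partitioned by country and dt, e.g. one created with
# PARTITIONED BY (country STRING, dt STRING):
print(partition_path("/user/hive/warehouse", "sales", country="IN", dt="2024-01-01"))
# /user/hive/warehouse/sales/country=IN/dt=2024-01-01
```

This is why partition pruning works: a query filtering on `dt` only has to read the matching directories instead of scanning the whole table.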
● What is Spark?
● History of Spark
● Spark Architecture
● Spark Shell
● RDD Basics
● Creating RDDs in Spark
● RDD Operations
● Passing Functions to Spark
● Transformations and Actions in Spark
● Spark RDD Persistence
● Pair RDDs
● Transformations on Pair RDDs
● Actions Available on Pair RDDs
● Data Partitioning (Advanced)
● Loading and Saving the Data
● Accumulators
● Broadcast Variables
● Piping to External Programs
● Numeric RDD Operations
● Spark Runtime Architecture
● Deploying Applications
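The distinction between transformations and actions above hinges on laziness. A minimal pure-Python stand-in (a teaching sketch, not real Spark) shows the idea: transformations only build up a pipeline, and nothing executes until an action is called:

```python
class FakeRDD:
    """A minimal, pure-Python stand-in for a Spark RDD, to show that
    transformations (map, filter) are lazy and only actions (collect,
    count) trigger computation. Illustration only, not Spark."""

    def __init__(self, data):
        self._data = data  # an iterable; nothing is evaluated yet

    # --- transformations: return a new FakeRDD, compute nothing ---
    def map(self, fn):
        return FakeRDD(fn(x) for x in self._data)

    def filter(self, pred):
        return FakeRDD(x for x in self._data if pred(x))

    # --- actions: force evaluation of the whole pipeline ---
    def collect(self):
        return list(self._data)

    def count(self):
        return sum(1 for _ in self._data)

rdd = FakeRDD(range(10)).map(lambda x: x * x).filter(lambda x: x % 2 == 0)
print(rdd.collect())  # [0, 4, 16, 36, 64]
```

Unlike this sketch, a real RDD can be recomputed or cached (RDD persistence) and is evaluated in parallel across partitions.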
● DataFrames
● What is Spark Streaming?
● Spark Streaming example
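Spark Streaming discretizes a live stream into small batches and applies batch operations to each one. A cluster-free Python sketch of that micro-batch idea (the batch size and sample data are illustrative):

```python
from collections import Counter

def micro_batches(events, batch_size):
    """Split a stream of events into fixed-size micro-batches, the way
    Spark Streaming discretizes a live stream into a series of small RDDs."""
    for i in range(0, len(events), batch_size):
        yield events[i:i + batch_size]

def streaming_word_count(stream, batch_size):
    # Keep a running total across batches, analogous to stateful
    # streaming operations that carry state from one batch to the next.
    running = Counter()
    for batch in micro_batches(stream, batch_size):
        running.update(word for line in batch for word in line.split())
    return dict(running)

stream = ["spark streaming demo", "spark kafka", "streaming demo demo"]
print(streaming_word_count(stream, batch_size=2))
```

Real Spark Streaming adds scheduling, fault tolerance and sources such as Kafka, but each micro-batch is processed with the same batch API shown earlier.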
● Introduction of HBase
● Comparison with traditional database
● HBase Data Model (Logical and Physical models)
● HBase Architecture
● Regions and Region Servers
● Partitions
● Compaction (Major and Minor)
● Shell Commands
● HBase using APIs
NoSQL
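The HBase logical data model above (row key, column family, qualifier, versioned cells) can be mimicked with ordinary Python dictionaries. This is a teaching sketch only, not an HBase client:

```python
import time

class TinyHBaseTable:
    """A dict-based sketch of HBase's logical data model: a map of
    row key -> column family:qualifier -> timestamped versions.
    Illustration only; real HBase adds regions, a WAL, HFiles, etc."""

    def __init__(self):
        self._rows = {}

    def put(self, row_key, family, qualifier, value, ts=None):
        ts = ts if ts is not None else time.time_ns()
        cell = self._rows.setdefault(row_key, {}).setdefault(f"{family}:{qualifier}", [])
        cell.append((ts, value))        # HBase keeps multiple versions per cell
        cell.sort(key=lambda v: -v[0])  # newest version first

    def get(self, row_key, family, qualifier):
        versions = self._rows.get(row_key, {}).get(f"{family}:{qualifier}", [])
        return versions[0][1] if versions else None  # newest version wins

table = TinyHBaseTable()
table.put("user#1", "info", "city", "Jaipur", ts=1)
table.put("user#1", "info", "city", "Delhi", ts=2)
print(table.get("user#1", "info", "city"))  # Delhi (the newest version)
```

A `put` never overwrites in place; it adds a new version, which is why reads return the latest timestamp and why compactions later discard old versions.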
● Pre-requisites
● Introduction
● Architecture
● Installation and Configuration
● Repository
● Projects
● Metadata Connection
● Context Parameters
● Jobs / Joblets
● Components
● Important components
● Aggregation & working with Input & output data
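The aggregation-with-input-and-output pattern that Talend components such as tAggregateRow implement can be sketched in plain Python (the column names and sample data are illustrative):

```python
import csv, io
from collections import defaultdict

def aggregate(rows, group_key, value_key):
    """Group input rows by one column and sum another: the kind of step an
    aggregation component performs between an input and an output flow."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[group_key]] += float(row[value_key])
    return dict(totals)

# Input data as it might arrive from a delimited-file input component:
raw = "region,amount\nnorth,10\nsouth,5\nnorth,2.5\n"
rows = csv.DictReader(io.StringIO(raw))
print(aggregate(rows, "region", "amount"))  # {'north': 12.5, 'south': 5.0}
```

In Talend the same flow is built graphically by wiring an input component, the aggregation, and an output component together.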
● The Pseudo Live Project (PLP) program primarily handholds participants who are new to the technology. In PLP, more importance is given to “Process Adherence”.
● The following SDLC activities are carried out during PLP:
o Requirement Analysis
o Design (High-Level Design and Low-Level Design)
o Design of UTP (Unit Test Plan) with test cases
o Coding
o Code Review
o Testing
o Deployment
o Configuration Management
o Final Presentation
Note: Content is subject to change by REGex as per requirement.
Other Industrial Internship/Training Programs
For webinar videos and demo sessions, join our YouTube channel.