Technical Architect for Big Data and Machine Learning | Location: Gurgaon, India - Jobzmagazine


Monday, 12 October 2015

Technical Architect for Big Data and Machine Learning | Location: Gurgaon

Overview

Job/Position Summary:
This position is responsible for designing, writing code, and providing hands-on guidance to the rest of the team; investigating and debugging systems, application, and core-platform issues in big data platforms; and determining the best technical implementation methods and execution schedules for developing big data solutions, incorporating Apache Hadoop or any Hadoop distribution and handling data ingestion from external systems.
We are looking for strong and passionate hard-core techies. The incumbent should have:
Demonstrated experience and expertise with managing and operating production big-data storage and compute environments.
Demonstrated experience and expertise with architecting and supporting diverse and performant solutions for business reporting and analytics requirements over huge big-data infrastructure.
Ability to devise solutions to problems, design and architect systems, and build and collaborate with business and technical teams to deliver software.
Experience extracting and consuming data from live systems to analyze and deliver a robust big-data environment with a 99.99% SLA.
Passion for technology and willingness to learn is required.
Ability to work in a fast-paced and dynamic work environment and produce efficient and robust solutions.
Primary Roles/Responsibilities:
Closely work with cross-functional teams to quickly prototype, design and implement ideas and projects.
Work in a teamwork environment that requires constant interaction with members of other teams with different levels of technical expertise.
Explain, formalize, and defend ideas using design diagrams and documents.
Perform unit and integration testing for the code developed.
Follow stringent development schedule to meet internal customer delivery dates.
Be involved in different phases of the software development cycle depending on the needs of the team.
Generate documentation and technical/functional specifications as needed.
Unit-level testing, including test plans, scripts, etc.
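As an illustration of the unit-level testing mentioned above, a minimal sketch using Python's built-in unittest module (the function under test, parse_record, is a hypothetical ingest helper, not part of this posting):

```python
import unittest

def parse_record(line):
    """Split a comma-separated ingest record into (key, value). Hypothetical helper."""
    key, _, value = line.partition(",")
    if not key or not value:
        raise ValueError(f"malformed record: {line!r}")
    return key.strip(), value.strip()

class ParseRecordTest(unittest.TestCase):
    def test_valid_record(self):
        # A well-formed record splits into a trimmed key/value pair.
        self.assertEqual(parse_record("user42, 17"), ("user42", "17"))

    def test_malformed_record(self):
        # A record without a delimiter is rejected rather than silently passed on.
        with self.assertRaises(ValueError):
            parse_record("no-delimiter")
```

Run with `python -m unittest <file>`; a test plan would enumerate cases like these before implementation.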
General:
Experience in software architecture design, working knowledge of design patterns.
Experience in all phases of software development lifecycle: requirements analysis, tech design, implementation, code review, testing, and release.
Excellent level of analytical ability, communication and interpersonal skills required to build effective relationships with team members and customers to solve problems and resolve issues.
Self-motivated, critical thinker with strong technical background.
Proficient in verbal communication, including presenting to large audiences.
Ability to work independently and be a self-starter.
Proficient in authoring, editing and presenting technical documents such as FRDs, TSDs and architecture documents.
Ability to effectively present information to team members and management, customers and prospects.
Must be a self-starter, proactive, and willing to learn and grow with the business.
Ability to work well under pressure and manage multiple tasks simultaneously.
Must demonstrate a hands-on approach, attention to detail and results orientation.
Work:
5+ years of Java development.
2+ years’ experience developing solutions on Big Data Platforms.
2+ years’ experience with at least one Hadoop distribution.
2+ years’ experience with the various tools and frameworks that enable capabilities within the Hadoop ecosystem (MapReduce, YARN, Pig, Hive, HBase, Oozie, Flume, Sqoop) and NoSQL.
Experience developing front-end and back-end solutions written in Java / J2EE.
2+ years’ experience with Kafka, Storm, Spark, Sqoop and Flume.
Experience designing, developing, and implementing Hadoop/Hive ETL.
Experience with scripting languages (Python, Perl, JavaScript, Shell).
Experience with UNIX/Linux including basic commands and shell scripting.
Experience with data mining, machine learning, statistical modeling tools or underlying algorithms.
Good to have experience with data visualization tools (like Tableau) but not mandatory.
Experience: 5+ years overall.
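The MapReduce programming model named in the requirements above can be sketched in a few lines. A minimal pure-Python word count for illustration only; a real job on the platforms listed would use the Hadoop MapReduce API or Spark rather than this in-memory stand-in:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit an intermediate (word, 1) pair for every word (analogous to a Mapper).
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group intermediate values by key, as the framework does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reduce_phase(grouped):
    # Reduce: aggregate the grouped values per key (analogous to a Reducer).
    return {key: sum(values) for key, values in grouped}

def word_count(lines):
    return reduce_phase(shuffle(map_phase(lines)))
```

For example, `word_count(["big data big"])` yields `{"big": 2, "data": 1}`; on a cluster, the same three phases run in parallel across HDFS blocks.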

About Ephesoft Inc.

Ephesoft: http://ephesoft.com/what-we-do
Ephesoft Smart Capture® intelligent document software utilizes powerful OCR algorithms to extract critical information from physical and electronic files.
Ephesoft, Inc. delivers our exclusive Smart Capture® intelligent capture technology on-premise or in a SaaS model. With a focus on efficiency and reliability, Ephesoft has crafted the next generation of intelligent document capture in a simple and cost-effective application. Ephesoft’s use of open standards and web-based software offers extreme flexibility and accessibility for a wide range of customers. Our superior classification technology allows organizations to automate and streamline mail room processing and other document-based business processes. Ephesoft is headquartered in Laguna Hills, California.
At Ephesoft, we create advanced capture products to help enterprises of all sizes and industries identify and extract meaning from their documents. More than 80% of business information is trapped in unstructured content; our goal is to make that data actionable. We initiate processes that add value to a company’s existing content and information management platforms. From automating mortgage applications and streamlining insurance claims to accelerating invoice payments and simplifying sales orders, Ephesoft allows our customers to put information to work and do business faster. This is why we were just named to the 2015 Inc. 500 list of fastest-growing privately held companies in the US.
