
Big Data/Hadoop Developer

TPFSoftware

Pay guide
Market insights based on all Hadoop Developer jobs in Sydney NSW (pay scale $100k to $250k+):
Estimated range: ~$100k
Market average: $199k
Skills
APACHE HADOOP IMPALA
APACHE HADOOP PIG/HIVE
BIG DATA
JSON
KERBEROS
LDAP
LINUX
RESTFUL WEB SERVICES
ZEPPELIN

Full job description

Role: Big Data Hadoop Developer

Company: TPFSoftware (www.tpfsoftware.com)

Experience: 5-8 years

Work location: Sydney, NSW, Australia

Eligibility: Australian citizens or Australian PR holders preferred.

Type: Full-time / Permanent

Responsibilities:


- Work directly with customers' technical resources to devise and recommend solutions based on their requirements

- Analyze complex distributed production deployments, and make recommendations to optimize performance

- Document and present complex architectures to customers' technical teams

- Help design and implement Hadoop architectures and configurations for customers

- Drive projects with customers to successful completion

- Write and produce technical documentation and knowledge base articles

- Stay current with Hadoop and Big Data ecosystem technologies

- Attend speaking engagements when needed

- Travel up to 75%

Qualifications:

- More than two years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions

- Experience designing and deploying large-scale production Hadoop solutions

- Ability to understand and translate customer requirements into technical requirements

- Experience designing data queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, Apache Phoenix, or others (a brief Hive sketch follows this list)

- Experience installing and administering multi-node Hadoop clusters

- Strong experience implementing software and/or solutions in enterprise Linux or Unix environments

- Strong understanding of enterprise security solutions such as LDAP and/or Kerberos

- Strong understanding of network configuration, devices, protocols, speeds and optimizations

- Strong understanding of Java development, debugging, and profiling

- Significant prior work writing to network-based APIs, preferably REST/JSON or XML/SOAP (a short REST/JSON sketch also follows this list)

- Solid background in database administration and design, along with data modeling with star schemas, slowly changing dimensions, and/or data capture

- Experience architecting data center solutions: selecting server and storage hardware appropriately based on performance, availability, and ROI requirements

- Demonstrated experience implementing big data use cases and an understanding of the standard design patterns commonly used in Hadoop-based deployments

- Excellent verbal and written communication skills
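
For illustration, a minimal sketch of the kind of Hive query work described above, using Python with the PyHive client. The host, username, and web_events table are hypothetical placeholders, not details of this role's environment:

    # Minimal PyHive sketch: connect to a Hive gateway and run an
    # aggregate query. Host, user, and table names are placeholders.
    from pyhive import hive  # pip install 'pyhive[hive]'

    conn = hive.Connection(host="hive-gateway.example.com", port=10000,
                           username="analyst", database="default")
    cursor = conn.cursor()
    cursor.execute("""
        SELECT event_date, COUNT(*) AS events
        FROM web_events
        WHERE event_date >= '2020-07-01'
        GROUP BY event_date
        ORDER BY event_date
    """)
    for event_date, events in cursor.fetchall():
        print(event_date, events)
    cursor.close()
    conn.close()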
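
Similarly, a small sketch of writing to a network-based REST/JSON API using Python's requests library; the endpoint URL and payload are invented for illustration only:

    # POST a JSON document to a REST endpoint and read the JSON reply.
    # The URL and payload below are hypothetical.
    import requests

    resp = requests.post("https://api.example.com/v1/records",
                         json={"source": "hadoop-etl", "status": "complete"},
                         timeout=10)
    resp.raise_for_status()
    print(resp.json())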

Nice-to-Have (Not Required) Experience:

- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.

- Hortonworks/Cloudera certification (Admin and/or Developer), or Data Science experience

- Familiarity with Data Science notebooks such as Apache Zeppelin, Jupyter, or IBM DSX

- Demonstrable experience implementing machine learning algorithms using R or TensorFlow

- Automation experience with Chef, Puppet, Jenkins or Ansible

- Familiarity with scripting tools such as bash shell scripts, Python, and/or Perl (a small scripting sketch follows this list)

- Experience with cloud platforms and deployment automation
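
As a small example of the scripting familiarity mentioned above, a Python sketch that shells out to the standard hdfs dfs -du command to report space used under a path. The default path is a placeholder, and the hdfs client is assumed to be on PATH:

    #!/usr/bin/env python3
    # Report HDFS space used under a path via the hdfs CLI.
    # The path and cluster setup are assumptions for illustration.
    import subprocess
    import sys

    path = sys.argv[1] if len(sys.argv) > 1 else "/data"
    result = subprocess.run(["hdfs", "dfs", "-du", "-s", "-h", path],
                            capture_output=True, text=True, check=True)
    print(result.stdout.strip())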

Job details
Date posted: 13 Jul 2020
Expiring date: 12 Aug 2020
Category: Information Technology
Occupation: Developers/Programmers
Estimated: ~$100,000
Contract type: Permanent
Work type: Full Time
Job mode: Standard/Business Hours
Career level: Executive
Industry: Consulting Services
Sector: Private Business
Desired education level: Bachelor's Degree, Master's Degree
Work Authorisation: Australian Citizen/Permanent Resident
Company size: 51 to 200
