Hadoop Engineer - global eCommerce leader - Bangkok

Location: Bangkok, Thailand
Salary: Negotiable
Posted: 01 Sep 2016
Closes: 29 Sep 2016
Ref: 16TA14G001P08
Contact: Stefania Cook
Job function: IT
Hours: Full Time
Contract Type: Permanent


Requirements:
  • Bachelor's degree in Computer Science, Information Systems, Engineering or a related field
  • At least 3 years' experience working with modern systems languages
  • Experience debugging and reasoning about production issues
  • A good understanding of data architecture principles

Bonus points:

  • Any experience with 'Big Data' technologies/tools
  • Strong systems administration skills in Linux
  • Strong experience in JVM tuning
  • Python/Shell scripting skills
  • Experience working with open source products
  • Experience working in an agile environment using test-driven methodologies

Role:

My client's Bangkok team is looking for top-quality, passionate Hadoop operations engineers to test, deploy and manage their next-generation Hadoop platform. They run multiple clusters spanning multiple data centres that already handle millions of writes per second over petabytes of data, and are growing at an exponential rate.

Their systems deal with problems ranging from real-time data ingestion, replication and enrichment through to storage and analytics. They are not just using Big Data technologies; they are pushing them to the edge.

This company is the largest and fastest-growing online hotel booking platform in Asia and is part of the largest online travel company in the world. They have the dynamism and short chain of command of a startup and the capital of a blue-chip to make things happen. In the competitive world of online travel agencies, finding even the slightest advantage in the data can make or break a company, which is why data systems are among their top priorities.

While they are proud of what they've built so far, there is still a long way to go to fulfill their vision for data. They are looking for people who are as excited about data technology as they are to join the fight. You can be part of designing, building, deploying (and probably debugging) systems across all aspects of their core data platform.

Responsibilities:

  • Manage, administer, troubleshoot and grow multiple Hadoop clusters
  • Build automated tools to solve operational issues
  • Run effective POCs on new platform products that can grow the list of services they offer

A few of the technologies they use:

Spray, Hadoop, Kibana, ElasticSearch, Yarn, Akka, Mesos, Kafka, Sensu, Redis, Scala, Python, Cassandra, Postgres, Spark, OpenStack, Logstash, Couchbase, Vertica, Grafana, Go.

Company:
Our client is one of the leading and fastest-growing IT employers in Southeast Asia. With elite talent from over 65 countries, the atmosphere of a start-up and the muscle of a $3bn business, this company is an incredible technical and creative melting pot. They believe strongly in the new generation of programmers: dynamic, innovative, curious, aware of the latest technologies and programming techniques, and at the same time willing to learn from - and challenge - the experienced developers already on board.

We welcome both local and international applications for this role. Full visa sponsorship and relocation assistance available for eligible candidates.


We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.