Apache Spark has become one of the most widely used frameworks for distributed data processing. Its mature codebase, horizontal scalability, and resilience make it a great tool for processing huge amounts of data.
Spark’s power and flexibility demand a developer who knows more than just the Spark API: they must also understand the pitfalls of distributed storage, how to structure a data processing pipeline that can handle the five Vs of Big Data (volume, velocity, variety, veracity, and value), and how to turn all of that into maintainable code.
Spark Developer - Job Description and Ad Template
Copy this template and modify it to make it your own:
Company Introduction
{{ Write a short and catchy paragraph about your company. Make sure to provide information about the company’s culture, perks, and benefits. Mention office hours, remote working possibilities, and everything else that you think makes your company interesting. }}
Job Description
We are looking for a Spark developer who knows how to fully exploit the potential of our Spark cluster.
You will clean, transform, and analyze vast amounts of raw data from various systems using Spark to provide ready-to-use data to our feature developers and business analysts.
This involves both ad hoc requests and data pipelines that are embedded in our production environment.
Responsibilities
Create Scala/Spark jobs for data transformation and aggregation (see the first sketch after this list)
Produce unit tests for Spark transformations and helper methods (see the second sketch after this list)
Write Scaladoc-style documentation with all code
Design data processing pipelines
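To make the first responsibility concrete, here is a minimal sketch of such a job: a pure transformation function with Scaladoc-style documentation, wrapped in a small driver. The data source, column names (customerId, amount), and storage paths are illustrative assumptions, not part of this template.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, sum}

object RevenueJob {

  /** Aggregates raw order records into total revenue per customer.
    *
    * @param orders raw orders with at least `customerId` and `amount` columns
    * @return one row per customer with a `totalRevenue` column
    */
  def revenuePerCustomer(orders: DataFrame): DataFrame =
    orders
      .filter(col("amount") > 0)                    // drop refunds and bad rows
      .groupBy(col("customerId"))
      .agg(sum(col("amount")).as("totalRevenue"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("revenue-job").getOrCreate()

    // Hypothetical input and output locations; replace with your own storage paths.
    val orders = spark.read.parquet("s3://example-bucket/raw/orders")
    revenuePerCustomer(orders)
      .write
      .mode("overwrite")
      .parquet("s3://example-bucket/curated/revenue_per_customer")

    spark.stop()
  }
}
```

Keeping the transformation as a standalone function, separate from the I/O in the driver, is what makes the second responsibility (unit testing) straightforward.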
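Following on from that, a unit test sketch for the same transformation: because it is a plain DataFrame-to-DataFrame function, it can be exercised on a tiny in-memory DataFrame with a local SparkSession, so no cluster is needed. ScalaTest and the RevenueJob object above are assumptions used only for illustration.

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

class RevenueJobTest extends AnyFunSuite {

  // Local SparkSession so the test runs without a cluster.
  private val spark = SparkSession.builder()
    .appName("revenue-job-test")
    .master("local[2]")
    .getOrCreate()

  import spark.implicits._

  test("revenuePerCustomer sums positive amounts per customer") {
    val orders = Seq(
      ("c1", 10.0),
      ("c1", 5.0),
      ("c2", -3.0) // refund: should be filtered out
    ).toDF("customerId", "amount")

    val result = RevenueJob.revenuePerCustomer(orders)
      .collect()
      .map(row => row.getString(0) -> row.getDouble(1))
      .toMap

    assert(result == Map("c1" -> 15.0))
  }
}
```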
Skills
Scala (with a focus on the functional programming paradigm)
Toptal is a marketplace for top Spark developers, engineers, programmers, coders, architects, and consultants. Top companies and startups choose Toptal Spark freelancers for their mission-critical software projects.
With a Ph.D. in electrical engineering and extensive experience in building machine learning applications, Andreas spans the entire AI value chain, from use case identification and feasibility analysis to implementation of custom-made statistical models and applications. Throughout projects, he stays focused on solving the business problem at hand and creating value from data.
Steve is a certified AWS solutions architect professional with big data and machine learning specialty certifications. He has a diverse background and experience architecting, building, and operating big data and machine learning applications in AWS. Steve has held roles from technical contributor to CTO and CEO.
United Arab Emirates. Toptal Member Since December 6, 2019.
Luigi is a seasoned cloud and leadership specialist with over two decades of professional experience in a variety of environments. He is passionate about technology and value-driven projects, and he is highly adaptable. Luigi has been part of significant industry transformation waves directly from some of the leaders driving the digital era.
As a highly effective technical leader with over 20 years of experience, Andrew specializes in data: integration, conversion, engineering, analytics, visualization, science, ETL, big data architecture, analytics platforms, and cloud architecture. He has an array of skills in building data platforms, analytic consulting, trend monitoring, data modeling, data governance, and machine learning.
Having studied advanced machine learning (ML) theory for the past three years, it’s safe to say Yuxiang knows ML quite well and he's delivered multiple projects using cutting-edge ML algorithms and tools. While at school, he also spent two years researching NLP. With a solid knowledge base in ML and NLP, hands-on experience, and exemplary communication skills—both written and verbal—Yuxiang will add value to your project.
Czech Republic. Toptal Member Since November 21, 2019.
Ivan has experience working as a data scientist and a data engineer in the network security and finance industries. This includes processing and cleaning data, formalizing business problems, and solving them by designing features and applying machine learning techniques. He works with big data using Spark and MapReduce and can visualize and present results to stakeholders in an easy-to-understand format.
Diego is a computer science licentiate with more than 15 years of experience. He's worked for companies of all sizes, both on-site and remotely, mainly as a senior developer/architect (programming in C/C++, Python, and recently Go) and as a technical leader for small teams of programmers. He has a problem-solving attitude and likes to use the most suitable tool for each task. He's a co-author of two patents and a few research publications.
Weidong Ding has proven experience as a senior data/integration architect, recently focusing on SAP Data Services. He's detail-oriented, hands-on, and efficient, with a comprehensive background in planning, designing, and implementing information systems for leading organizations in the banking, transportation, retail, and government sectors. He leverages strong communication and customer service skills, working with clients and colleagues to achieve success.