
Big Data and Hadoop Support

₹12500-37500 INR

Closed
Posted over 5 years ago


Paid on delivery
Working on a Hadoop project that involves Spark with Scala, Hive, Impala, and Sqoop. Looking for 2 hours of support daily. Monthly payout: INR 20,000. Please mention your experience with all of these technologies.
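For context, daily support on the stack named above usually means small Spark-with-Scala jobs over Hive tables. A minimal sketch of that kind of job, assuming a cluster with Hive configured (the sales table and column names are hypothetical):

```scala
import org.apache.spark.sql.SparkSession

object DailySupportJob {
  def main(args: Array[String]): Unit = {
    // Hive support lets Spark read tables that Sqoop loaded into the warehouse
    val spark = SparkSession.builder()
      .appName("daily-support-job")
      .enableHiveSupport()
      .getOrCreate()

    // Query a Hive table (hypothetical name) and aggregate with Spark SQL
    val totals = spark.sql(
      "SELECT region, SUM(amount) AS total FROM sales GROUP BY region")

    // Write the result back as a Hive table, which Impala can then query
    totals.write.mode("overwrite").saveAsTable("sales_totals")

    spark.stop()
  }
}
```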
Project ID: 17732704

About the project

16 proposals
Remote project
Active 6 years ago

16 freelancers are bidding an average of ₹22,882 INR for this job
Hi, I am a data engineer with 3+ years of industry experience. I have deployed highly scalable, resilient, and durable big data solutions both in the cloud and on-premises. My expertise covers: 1. Spark pipelines on AWS and on-premises 2. Data ingestion into HDFS and Hive using Sqoop, and vice versa 3. Streaming analytics with Kafka and Spark Streaming 4. Scala and sbt 5. HDP and HDF distributions, on both the admin and developer tracks. Looking forward to hearing from you.
₹25,000 INR in 10 days
4.9 (22 reviews)
I have 2 years of experience with all of these technologies and am certified in Spark and the Hadoop ecosystem as well. Let's discuss in chat to finalize the deal.
₹22,222 INR in 10 days
0.0 (0 reviews)
I am a certified Hadoop developer and have worked on many projects.
₹18,888 INR in 20 days
0.0 (0 reviews)
I have been working with these technologies for a year and have gained a decent amount of knowledge. Hadoop: 1.6 years; Sqoop: 6 months; Spark with Java: 1 year; Scala: 3 months; Hive: 1 year.
₹15,555 INR in 15 days
0.0 (1 review)
What kind of work would it be? Data scrubbing, transformations? Or what kind of processing would you need performed?
₹22,222 INR in 10 days
0.0 (0 reviews)
I currently work with the same skills you require and have 1.5 years of experience. I graduated from one of India's top institutes, an NIT. I am a hard worker and can provide support for up to 6 months. Relevant Skills and Experience: Spark, Hive, Sqoop, Scala, Java, SQL, Linux
₹22,222 INR in 90 days
0.0 (0 reviews)
• 2 years of analysis and development experience on working projects and prototypes.
• Hands-on experience with the major Hadoop ecosystem components: Apache Spark, MapReduce, HDFS, Hive, Pig, Sqoop, and HBase.
• Implemented Apache Spark procedures such as test analytics and processing using its in-memory computing capabilities.
• Experience exporting and importing data with Sqoop between HDFS and relational database systems.
• Capable of processing large sets of structured, semi-structured, and unstructured data, and of supporting systems application architecture.
• Involved in all Software Development Life Cycle (SDLC) phases: analysis, design, implementation, testing, and maintenance.
• Excellent problem-solving and communication skills.
• Learning and organizing skills, with the ability to manage stress, time, and people effectively.
• Able to prioritize tasks and plan the workload by importance.
• Adapt to any environment in a short span of time.
₹20,000 INR in 10 days
0.0 (0 reviews)
For the past 4 years, I have been working on various components of the Hadoop ecosystem, including Spark, Impala, Hive, and Sqoop. Yes, I can provide the assistance required. Relevant Skills and Experience: We deployed Spark and Impala for fast execution of queries and jobs; since that implementation, we have never missed traditional MapReduce for running Hive-on-Spark jobs.
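For reference, the Hive-on-Spark switch this bid describes hinges on a single Hive setting; a config fragment, assuming a Hive build with Spark support available (the query and table name are hypothetical):

```
-- Switch Hive's execution engine from the default MapReduce to Spark
SET hive.execution.engine=spark;

-- Queries now run as Spark jobs rather than MapReduce jobs
SELECT region, COUNT(*) FROM sales GROUP BY region;
```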
₹55,555 INR in 10 days
0.0 (0 reviews)
4 years of experience with the Hadoop ecosystem, Spark, and Scala
₹12,500 INR in 10 days
0.0 (0 reviews)
I currently work as a Hadoop developer on a project that involves Spark with Scala, Hive, Impala, Sqoop, and Python, so I can complete this work as an experienced professional. I can work 2 hours daily.
₹22,222 INR in 12 days
0.0 (0 reviews)
I have 4 years of experience with Hadoop ecosystems and good working experience with the Cloudera platform. Experienced in Sqoop, Flume, Kafka, MapReduce, Pig, Hive, HBase, Cassandra, Spark Core, and Spark SQL; I can create and transform RDDs with the Python API.
₹27,777 INR in 10 days
0.0 (0 reviews)

About this client

Pune, India
Member since Sep 8, 2018

Client verification

More jobs from this client

DevOps Support
₹12500-37500 INR