![Arujit Pradhan](https://weekday-user-pics.s3.us-east-2.amazonaws.com/profile-images/the-arujit-pradhan.jpg)
Arujit Pradhan
Engineering at Coinbase
About
Arujit Pradhan is a skilled software engineer with a passion for building data platforms. He has over 4.7 years of relevant experience and currently works at Coinbase as an engineering professional. Arujit is proficient in programming languages such as Python, Scala, Java, C++, and shell scripting (Bash), and is experienced with development environments such as IntelliJ, Emacs, Vim, and RStudio. He has a strong background in big data technologies, with hands-on experience in Flink, ZooKeeper, Kafka, Spark, Hadoop, and Docker. He is also skilled in data analysis using R, WEKA, and Spark MLlib, and has experience in data center management using Kubernetes and YARN.

In his previous role at Gojek, Arujit contributed to the development of Dagger, a framework for end-to-end runtime injection of custom Java code in an in-house Flink-based data streaming platform; he built smoke and integration testing pipelines for it and added multiple other features. He implemented a key-value state backend on top of Bigtable for complex Flink streaming queries that require up to three months of production data. He also built a CLI tool for all in-house data products, using Thor to expose Git-based data platforms to customers, and added multiple features to in-house Kafka sink platforms, including Redis and Influx sinks, templatised JSON body support for HTTP sinks, and sidecar-based monitoring using an Influx reporter.

At MakeMyTrip, Arujit worked on GoMemory, a centralized real-time user-profiling database built on DynamoDB and Kafka Streams that serves user details in under 200 ms; he built a continuous integration testing framework to ensure its correctness and a lightweight web API on top of it for serving user personalization data. He added Avro and Protobuf support to a Kafka-to-Redshift/S3 ETL tool, worked on an S3- and Delta-based data warehouse, maintained and continuously enhanced data infrastructure such as Kafka, YARN, and Airflow, and established a fault-tolerant CDC pipeline using Debezium.

Arujit holds a Bachelor of Technology (B.Tech.) in Computer Science from the National Institute of Technology Silchar and a Higher Secondary degree in Science from FM College. His tech stack includes Software Engineering, Kafka, Spark, Web3, Java, and Big Data, among others.
Education Overview
• National Institute of Technology Silchar (NIT Silchar)
• FM College
Companies Overview
• Coinbase
• GoTo Group
• MakeMyTrip
• MedCords
• The Social Street
• CognitiveScale
• Indian Institute of Technology Bhubaneswar
Experience Overview
6.7 Years