Jr Developer

Noida, Uttar Pradesh · 1-4 yrs · Permanent · On-site · INR 7-13 LPA

Hiring for: A US-based IT services and products business with 20+ years in operation.

Role: Jr Developer

Positions: 1

Experience: 1 to 4 years

Location(s): Noida

Type: On-site / Permanent

Salary: Up to INR 13 LPA

Notice Period: 30 days


JOB SUMMARY:


As a Data Developer, you will be responsible for designing, developing, and maintaining robust database systems and data pipelines. Your role will focus on ensuring the efficient storage, retrieval, and processing of data to support diverse business needs such as application development, reporting, and analytics. This includes working with large datasets and complex data models to build and optimize scalable, high-performance data solutions.


ROLES & RESPONSIBILITIES

• Design, develop, and optimize complex SQL queries and stored procedures in SQL Server to support data processing needs.

• Create and manage indexes, triggers, and SQL jobs to ensure high-performance data access and automated operations.

• Monitor and fine-tune database performance and troubleshoot query-related issues.

• Design and implement ETL pipelines using cloud-based tools such as Azure Data Factory, AWS Glue, or Google Cloud Dataflow.

• Handle data extraction, transformation, and loading from various structured/unstructured data sources.

• Work with big data platforms such as Apache Spark to process large datasets when applicable.

• Participate actively in Agile/Scrum ceremonies, sprint planning, and daily stand-ups.

• Deliver user stories and tasks within time-bound sprints while maintaining quality and adherence to coding standards.

• Implement data quality checks, validation rules, and exception handling in ETL pipelines.

• Ensure completeness, accuracy, and consistency of data across systems.

• Integrate data from disparate sources including databases, APIs, and flat files into centralized repositories or warehouses.

• Schedule and manage ETL jobs and data pipelines using workflow orchestration tools (e.g., SQL Agent, Airflow, or cloud schedulers).

• Maintain technical documentation for stored procedures, ETL pipelines, and data flows.

• Collaborate with cross-functional teams and stakeholders with clear and timely communication.

• Analyze complex business problems and transform them into scalable data solutions.

• Take initiative as a self-starter to identify improvement areas and recommend or implement changes.
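To illustrate the data-quality and exception-handling responsibilities above, here is a minimal Python sketch of a validation step in an ETL pipeline. All names (`validate_row`, `run_pipeline`, the sample fields) are hypothetical, not taken from the employer's codebase:

```python
def validate_row(row):
    """Apply simple data-quality rules; return a list of violations."""
    errors = []
    if not row.get("id"):
        errors.append("missing id")
    if row.get("amount") is not None and row["amount"] < 0:
        errors.append("negative amount")
    return errors


def run_pipeline(rows):
    """Split rows into loadable records and rejected records with reasons."""
    loaded, rejected = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            # Exception handling: quarantine bad rows instead of failing the load.
            rejected.append({"row": row, "errors": errors})
        else:
            loaded.append(row)
    return loaded, rejected


if __name__ == "__main__":
    sample = [
        {"id": 1, "amount": 100.0},
        {"id": None, "amount": 50.0},
        {"id": 3, "amount": -5.0},
    ]
    ok, bad = run_pipeline(sample)
    print(len(ok), len(bad))  # 1 good row, 2 rejected
```

In practice, the same split-and-quarantine pattern appears in tools like Azure Data Factory or AWS Glue as conditional routing, with rejected rows landing in an error table or file for review.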


MUST HAVE

• Strong experience in SQL development, including writing efficient and optimized queries.

• Hands-on expertise in developing complex stored procedures.

• Proficient in creating and managing indexes, triggers, and SQL jobs for database performance and automation.

• Experience working with any cloud-based ETL tools on platforms such as Azure, AWS, or GCP.

• Proven ability to work within an Agile development environment, participating in sprints and iterative delivery.

• Strong analytical and problem-solving skills with a proactive, self-motivated approach.

• Effective communication skills, both written and verbal, for collaboration with technical and non-technical stakeholders.
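As a sketch of the index-management skill listed above, the following example uses Python's built-in SQLite module as a stand-in (the role itself targets SQL Server, whose syntax and tooling differ) to create an index and confirm the query planner picks it up. Table and index names are illustrative:

```python
import sqlite3

# In-memory database as a lightweight stand-in for SQL Server.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

# Index to speed up equality lookups on customer_id.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# Inspect the plan; it should reference idx_orders_customer.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer_id = ?",
    (42,),
).fetchall()
print(plan)
```

On SQL Server the equivalent check would use the graphical or `SET SHOWPLAN` execution plans rather than `EXPLAIN QUERY PLAN`, but the workflow (add an index, verify the optimizer uses it) is the same.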


GOOD TO HAVE

• Practical experience in programming with either Python or PySpark for data processing and transformation.

• Exposure to BI tools such as Power BI, Tableau, or Looker for reporting or validating data output.

• Familiarity with cloud-native databases and warehouses such as Azure SQL Database, Amazon Redshift, Google BigQuery, or Snowflake.

• Exposure to cloud storage tools such as Azure Blob Storage, Amazon S3, or Google Cloud Storage.

Skills

Apache · Business Intelligence Tools · Cloud Computing · Cloud Data Warehouse · Data Pipeline · ETL · PySpark · Python · SQL · SQL Server

Posted March 12, 2026