This position is for Morgan Stanley.
Employment type: Full-time or Contractor
Location: Mumbai (client location)
Shift timing: 12:00 p.m. to 9:00 p.m. IST
Possible start date: March 30, 2026
Experience required: 3-5 years
Openings: 3
Primary (Must Have) Skills:
- Strong SQL and Database knowledge (Snowflake/Hadoop).
- Strong Python/Shell scripting knowledge.
- Good communication skills.
Secondary (Good to have) Skills:
- TWS Scheduler (IBM Workload Scheduler)
- Agile Development.
Requirement:
Job Title: Application Support Analyst (Data Platform) - 3-5 years
Summary
- Provide L2 application/data operations support for analytics and ETL workloads. Focus on SQL-based investigation, monitoring scheduled jobs, basic Python/Shell automation, and clear stakeholder communication.
- Shift: 12:00 p.m. to 9:00 p.m. IST
- Target start date: March 30, 2026
Key Responsibilities
- Monitor and support daily data pipelines and batch jobs across Snowflake/Hadoop environments.
- Investigate incidents and service requests using strong SQL; perform data validation, reconciliation, and root-cause triage.
- Develop and maintain lightweight Python/Shell scripts to automate health checks, log parsing, alerts, and routine recovery steps.
- Operate job schedules using TWS (IWS): monitor job streams, handle failures, perform safe restarts/reruns, manage dependencies and calendars as per runbooks.
- Own tickets end-to-end in the ITSM tool (e.g., ServiceNow/Jira): classify, prioritize, communicate status, and drive to closure with vendors and internal teams.
- Perform environment and application health checks, create/maintain runbooks and knowledge articles, and contribute to problem management and RCA documentation.
- Collaborate with L3 engineers, data engineers, and product owners; participate in Agile ceremonies and release/smoke validations as needed.
- Produce daily shift handovers and weekly operational reports.
Must-Have Skills and Experience
- 3-5 years in application support, data operations, or ETL/analytics platform support.
- Strong SQL and database skills, including writing complex queries and troubleshooting performance on at least one platform; exposure to Snowflake and/or Hadoop (Hive/HDFS) preferred.
- Strong scripting with Python and Shell on Linux/Unix (file I/O, error handling, log parsing, simple APIs, scheduling fundamentals).
- Solid understanding of batch processing, data warehousing/ETL concepts, and incident management practices.
- Good communication skills: clear written updates, user/vendor coordination, and concise handovers.
- Familiarity with version control (Git) and working in ticketing systems (ServiceNow/Jira).
Good-to-Have Skills
- TWS/IWS (IBM Workload Scheduler) hands-on experience.
- Agile development practices.