Description & Requirements

Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock – from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes.


Our Team:

Our team is responsible for acquiring and maintaining Equity Corporate Actions data across its full life-cycle, including corporate actions, IPOs, and M&A, in order to create a comprehensive client experience across the product offering (including but not limited to Enterprise Data, Indices, News, Terminal, and AIM). Multi-functional collaboration, deep domain knowledge, thoughtful automation, and data management expertise are paramount to our ability to continuously deliver high-quality data to our rapidly growing client base. Our data is a key building block in Bloomberg's overall offering, used by more than a third of Bloomberg's 320,000 users.


What's the Role?

We are looking for a highly motivated individual with a passion for finance, data, and technology to increase the value of our data product. In this role you will be responsible for developing strategies to optimize the value of Equity Corporate Actions data for our clients and improve data operations. You will be diving deep into complex datasets, which requires you to understand data requirements, specify the modeling needs of datasets, and use the existing tech stack to build efficient data ingestion and pipelining workflows. You will implement technical solutions using programming, machine learning, AI, and human-in-the-loop approaches to make sure our data is fit-for-purpose for our clients.


We'll Trust You To:
  • Build and maintain robust, scalable data pipelines to support the ingestion, transformation, and loading of vast amounts of data from various sources, using Bloomberg tech stack components such as Bloomberg Data Services, Dataflow recipes, and Data Technologies Pipelines, or an equivalent stack such as Amazon S3, AWS Lambda, Kafka, and Python pandas
  • Develop, maintain and enhance data processing workflows to support data quality strategies
  • Devise and implement data acquisition strategies for data fields from diverse, disparate sources, both structured and unstructured
  • Set up business rules and visualizations to measure and ensure the accuracy, timeliness, and completeness of Corporate Actions data, using Bloomberg tech stack components such as business rule engines and QlikSense
  • Design producer database structures optimized for our specific use cases
  • Analyze internal processes to identify opportunities for improvement, and engineer efficient, innovative workflows using programmatic machine-learning approaches
  • Use your deep understanding of Equity markets and data, including trading and analytics workflows, to create comprehensive and transparent solutions that fit the use cases of our internal and external clients
  • Understand clients' and markets' needs for each Corporate Actions data field in order to extract and maintain it
  • Collaborate with partners in creating data manipulation frameworks and establishing standard methodologies using Bloomberg's tech stack
  • Apply your proven project management skills to ensure all technical projects stay on track with the right requirements
  • Partner with our Product, Technology and Data Management Lab team to ensure consistent principles are leveraged, tools are fit for purpose, and results will be measurable
  • Balance the best of technical and product knowledge to craft solutions for customers
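To make the business-rule bullet above concrete, here is a minimal sketch of what completeness and validity checks over corporate-actions records might look like, written in pandas. The field names, sample records, and rules are illustrative assumptions for this posting, not Bloomberg's actual rule engine or schemas.

```python
import pandas as pd

# Hypothetical corporate-actions records; real schemas are internal.
actions = pd.DataFrame([
    {"action_type": "DIVIDEND", "ticker": "ABC", "ex_date": "2024-03-01", "amount": 0.25},
    {"action_type": "SPLIT",    "ticker": "XYZ", "ex_date": None,         "amount": None},
    {"action_type": "DIVIDEND", "ticker": "DEF", "ex_date": "2024-03-08", "amount": -1.0},
])

def run_rules(df: pd.DataFrame) -> pd.DataFrame:
    """Flag records that violate simple completeness/validity rules."""
    issues = []
    # Completeness: every action needs an ex-date.
    missing_ex = df["ex_date"].isna()
    issues.append(df.loc[missing_ex].assign(rule="ex_date is required"))
    # Validity: dividend amounts must be positive.
    bad_amount = (df["action_type"] == "DIVIDEND") & (df["amount"] <= 0)
    issues.append(df.loc[bad_amount].assign(rule="dividend amount must be positive"))
    return pd.concat(issues, ignore_index=True)

violations = run_rules(actions)
print(violations[["ticker", "rule"]])
```

In practice, each rule's violation count would feed a dashboard (e.g. QlikSense) so accuracy and completeness can be tracked per market and per field over time.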

You'll Need to Have:
  • 4+ years of experience working with Python, SQL and/or NoSQL
  • 4+ years of experience working in a data engineering role
  • Proven experience in data management and experience in building ETL pipelines
  • Proficiency with a tech stack such as Amazon S3, AWS Lambda, Kafka, and Apache Airflow in a production environment
  • Exceptional problem-solving skills, numerical proficiency and high attention to detail
  • Ability to work independently as well as in a distributed team environment
  • Ability to communicate and present concepts and methodologies effectively to diverse audiences
  • Demonstrated continuous career growth within an organization
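The ETL-pipeline experience listed above can be sketched in miniature. The following self-contained example shows the extract-transform-load pattern using only the Python standard library (csv and sqlite3); the feed format, normalization steps, and table schema are hypothetical stand-ins for real sources such as S3 objects or Kafka topics.

```python
import csv
import io
import sqlite3

# Illustrative raw feed; a real pipeline would read from S3, Kafka, etc.
RAW = """ticker,action_type,amount
abc,dividend,0.25
xyz,split,
abc,dividend,0.30
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV feed into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize casing and coerce amounts to floats (empty -> None)."""
    return [
        {
            "ticker": r["ticker"].upper(),
            "action_type": r["action_type"].upper(),
            "amount": float(r["amount"]) if r["amount"] else None,
        }
        for r in rows
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Write normalized rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS corporate_actions "
        "(ticker TEXT, action_type TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO corporate_actions VALUES (:ticker, :action_type, :amount)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
count = conn.execute("SELECT COUNT(*) FROM corporate_actions").fetchone()[0]
print(count)  # → 3
```

In production the same three stages would typically be separate tasks in an orchestrator such as Apache Airflow, so that each stage can be retried and monitored independently.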

We'd Love to See:
  • Data Management Association (DAMA) Certified Data Management Professional (CDMP) or Data Capability Assessment Model (DCAM) certification
  • Experience related to ingesting and normalizing exchange disseminated Equity Corporate Actions data

Does this sound like you? Apply if you think we're a good match. We'll get in touch to let you know what the next steps are.

Salary Range: 110,000 - 170,000 USD annually + Benefits + Bonus

The referenced salary range is based on the Company's good faith belief at the time of posting. Actual compensation may vary based on factors such as geographic location, work experience, market conditions, education/training and skill level.



We offer one of the most comprehensive and generous benefits plans available and offer a range of total rewards that may include merit increases, incentive compensation, [Exempt roles only], paid holidays, paid time off, medical, dental, vision, short and long term disability benefits, 401(k) +match, life insurance, and various wellness programs, among others. The Company does not provide benefits directly to contingent workers/contractors and interns.

Is a Remote Job?
No

Bloomberg unleashes the power of information and technology to bring clarity to a complex world.

Global customers rely on us to deliver accurate, real-time business and market-moving information that...

Apply Now