experts. The individual will be responsible for developing integration
solutions that shape and define the enterprise integration strategy for BMC
products and solutions.
The ideal candidate should have these characteristics:
* Expects, requires, and demonstrates continuous innovation.
* Self-starter who requires minimal oversight yet stays consistently engaged, with flexibility and strong follow-through.
* Enthusiastic about managing challenging projects across multiple teams and locations.
* Comfortable marshaling large amounts of data to make decisions and build business cases.
* Shares BMC's passion for the customer.
* Embraces change and works well in a fluid environment.
* Able to operate and communicate at all levels of the business.
* Results-oriented problem solver.
* Able to think big and inspire passion in others.
* Understands the challenges enterprise organizations face when integrating workflow automation solutions into their environments to deliver business outcomes to end users.
Primary Roles and Responsibilities:
* Work with internal teams and customers to understand market needs and deliver integration solutions that solve customer problems.
* Create integrations for BMC’s market-leading Control-M product, with an emphasis on integrations with data operations applications, data-related cloud services, and other applications and cloud services.
* Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and public-cloud big data and data warehouse technologies.
* Develop connectors and development tools that work with REST APIs, flat files, protocols such as EDI, and data formats such as JSON, XML, and CSV, as well as the mappings between them.
* Design and build robust data integrations within our data platform and between engineering systems to enable touchless data processing.
* Provide on-call support as needed.
Must-have experience:
* Programming experience with at least one of Python, Java, Ruby, or Go.
* Experience with Linux OS and scripting.
* Experience with software-as-a-service (SaaS) product development and delivery.
* Experience building processes that support data transformation, data structures, metadata, and dependency and workload management.
* Solid experience in enterprise application integration, including SOA, BPM, APIs, web services (SOAP/RESTful), microservices, and containerization (OpenShift, Kubernetes, etc.).
* Experience with using JSON, XML, XSD, and other data payload formats.
Nice-to-have experience:
* Experience with data pipeline and workflow management tools such as Airflow, Prefect, or Dagster.
* Advanced working knowledge of SQL, experience with relational databases and query authoring (SQL and NoSQL), and working familiarity with a variety of databases (SQL Server, Postgres, Oracle, Cassandra, etc.).
* Previous roles may have included work as a data engineer or integrations developer, or in technical presales or professional services.
* Experience building and optimizing big data pipelines, architectures, and data sets using technologies such as Hadoop, HDFS, Spark, Kafka, EMR, Snowflake, and Redshift.
* Experience with middleware technologies (MuleSoft, Kafka, Lambda).
* Experience with AWS or GCP.
* Experience with DevOps tooling (CI/CD) and with dependency management and build tools.
* Experience with stream-processing systems such as Storm and Spark Streaming.