
    Working with BCS is not a job. It's a journey!

Career at BCS

Take the first step towards your dream career at BCS. Working with BCS is not a job. It's a journey! There's so much to explore and experience. Every day is a learning process with challenges to meet and goals to achieve. We offer a motivating and enjoyable work environment.

Why should you work at BCS?


Open door policy:

Our open and inclusive corporate culture welcomes you into the team immediately, irrespective of your experience.

On-the-job learning:

Intensive training and development programs facilitate on-the-job learning.

Mentor programs:

Our mentor programs foster supportive relationships that help you develop the skills, behaviors, and insights needed to attain your goals.

'Global Family' identity:

What sets BCS apart is the support, encouragement, and nurturing provided to you at every step. The work environment at BCS is built around the belief of growth beyond boundaries. Some of the key elements that define our work culture are global exposure, cross-domain experience, and work-life balance.

Build your dream career

BCS is looking for talented individuals with long-term professional goals. Our customized professional development and leadership training help you learn new skills, work across different disciplines, and move into new challenges.

BCS offers competitive benefits, as well as an industry-leading practice of performance-based bonuses for all employees. We believe that global innovation demands diverse employees and attractive work/life initiatives that sustain and retain them. BCS gives you the power to design your workday, and your life, according to your unique style and needs. Explore a wide range of careers, develop industry-leading innovations, and work with the latest technologies and brilliant minds across the globe at BCS!

Current openings

Position: Big Data Developer
Location: Dallas, TX

Skills:
• Drive full adoption of modern software engineering and delivery practices: design principles and patterns, Agile, big data development on Hadoop with Spark and Scala, stateless Java REST/Spring Boot microservices, containerization, etc.
• Manage relationships with key technology and business partners, product owners, and other stakeholders.
• Design and develop security policies and custom integrations with IAM systems (such as Okta AD, PingID).
• Build integrations with single-page apps, mobile apps, and third-party systems using OAuth2 and SSO.
• Build APIs, common reusable components, and automation frameworks for the client's intranet access to physical services running in sandbox or test environments.
• Bring deep knowledge and experience in designing and implementing solutions in the cloud (AWS).
• Ensure that the CI/CD pipeline covers lifecycle-management needs for enterprise use cases (such as APIs deployed in the legacy API Management environment).

Experience:
• 3+ years of experience building enterprise software applications on a Java/J2EE technology stack, and with big data ingestion and distribution frameworks using Hadoop, Spark, and Scala.
• 2+ years of experience working with Spark using Scala and Java on the MapR distribution (MapR FS) and Hortonworks, and with big data tools including Hive, Drill, Hue, Kafka, and Solace, alongside HBase, MongoDB, and MapR-DB (binary and JSON) NoSQL databases.
• Very strong knowledge of object-oriented programming in Java, and of Spring MVC and Spring Boot microservice applications.
• Strong experience with relational database concepts, SQL, and procedural languages; object-oriented design; enterprise, distributed, and web-based computing methods; and design patterns.
• Must understand the concepts of SOAP and REST services as well as both XML and JSON message formats.
• Deep knowledge and experience in designing and implementing solutions in the cloud (AWS).
• Proficient in Continuous Integration (CI) and Continuous Deployment (CD) pipelines using Jenkins.
• Strong analytical and problem-solving skills, including the ability to decompose complex problems and perform root cause analyses.
• Ability to work in a collaborative environment.
• Experience with various testing methodologies and strategies: Test-Driven Development (TDD) implemented with JUnit, mock objects, stubs, test suites, and test harnesses.
• Experience working with agile team tools (GitHub, JIRA, Scrum).
• Experience working with the Eclipse and IntelliJ IDEs and the Maven and Gradle build tools.
• Ability to self-organize, prioritize, and handle multiple priorities without compromising quality.

Note: Delivery Model
- Good knowledge of Agile/Scrum and SDLC methodologies.
- Software engineering practices: TestNG, Mockito, JUnit, JIRA, Splunk, Log4j, TDD.
- DevOps: GitHub, uDeploy, Jenkins for Continuous Integration / Continuous Deployment (CI/CD), Stash, Autosys jobs, Maven, and Gradle.

Technology aspects
- Big data & Hadoop – Spark 2.2.4, Scala 2.12, Hive, Drill, Red Hat Ceph, OpenIO, Scality RING, Kafka, Solace, Hue, ZooKeeper.
- Java – Core Java, J2EE, multithreading, Java 1.8, Spring, Spring MVC, Spring Boot, Hibernate, Spring (JPA, REST) microservices, Swagger API.
- Cloud computing – AWS, S3-compatible APIs, knowledge of Kubernetes and containerization.
- Database – SQL Server, Oracle 10g & 11g, DB2, Teradata.
- NoSQL DB – HBase, MapR-DB, MongoDB.
Position: Senior Software Developer
Location: Dallas, TX

Skills:
• Develop cloud-based big data analytic solutions: contribute to the development of a big data platform in Azure Data Lake and AWS using pipeline technologies such as Spark, Oozie, and YARN to support requirements and applications.
• Implement and support streaming technologies using Apache Kafka and the Spark Streaming APIs.
• Implement AWS components: Glue, Glacier, EMR, EC2, S3, Redshift.
• Build scalable data pipelines on top of Hive using different file formats and data serialization formats such as Protocol Buffers, Avro, and JSON.
• Implement machine learning solutions using software frameworks such as Caffe, Torch 7, Keras, and TensorFlow.
• Develop components in multiple languages; analyze business and product requirements and contribute to the overall use case.
• Participate in requirement, design, and analysis reviews; provide input to architecture recommendations.
• Implement big data security applications and best practices for using these technologies.
• Experience with scalable applications and highly available system design and development, with a focus on speed and fault tolerance.
• Experience in performance tuning of complex distributed Spark systems.
• Translate complex business requirements into detailed design documents.
• Cleanse and perform computations on large raw data sets using the Hadoop ecosystem and RDBMS technologies.
• Improve data pipeline performance by tuning user queries, analyzing complex query plans, and optimizing Spark configuration settings.
• Implement test scenarios; perform unit and integration testing on data.
• Write code in the Java, Scala, and Python programming languages.
• Develop batch and real-time applications with various data sources on Spark clusters.

Experience:
• 8+ years of software programming experience.
• Hands-on experience with "big data" technologies such as Hadoop, Hive, Kafka, and Spark, as well as a general understanding of the fundamentals of distributed data processing (data lakes, ETL/data warehousing, database design).
• 5+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
• Database knowledge and proficiency in SQL-style queries; Cosmos Scope and big data platform experience preferred.
• Hands-on experience and strong proficiency with Scala, Python, or Java.
• Experience building and optimizing data collection architectures and pipelines.
• Cloud technologies experience/knowledge is a plus (GCP, AWS, Azure).
• Proficient knowledge of the Apache Hadoop ecosystem.
• Expertise with Linux.
• Good understanding of machine learning or probability/statistics, along with AI tools.
• Self-motivated, strong sense of ownership, good teammate.

Note: Senior Software Developer with a Bachelor's degree in Computer Science, Computer Information Systems, or Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of the aforementioned subjects.
Position: Matlab Software Engineer
Location: Dallas, TX

Skills:
• Perform independent research and development on emerging Blow Out Preventer (BOP) technologies.
• Strong understanding of offshore rigs and BOP applications.
• Strong signal processing skills.
• Excellent MATLAB skills.
• Customer engagement, with solid strategic thinking, business development, and execution skills to meet business requirements and deadlines.
• Strong connections with global technical leaders and academic researchers.
• Work independently, work with other team members, and lead teams in a dynamic, fast-paced environment.
• Excellent skills in communicating with the executive management team.
• Work with other Innovation team leaders and members to design the overall solution and ensure that the designs satisfy the product requirements.
• Define, design, develop, and debug real-time MATLAB GUI-based application system software for forward-looking products and user experiences.
• Develop strategies for pulling sets of operating data from offshore rig databases (i.e., the OSI PI database and MS SQL database) using MATLAB scripts.
• Ability to deliver clean and efficient MATLAB code with organized documentation on Windows; knowledge of and hands-on experience with version control (e.g., Git) and code review.
• Learn constantly; dive into new areas with unfamiliar technologies.
• Identify, evaluate, and drive third-party algorithm vendors.
• Generate novel, innovative solutions to some of the toughest problems in BOP.
• Test application code per the quality assurance requirement documents and error-handling techniques, i.e., both white-box and black-box testing.
• Use a Git repository tool in the development phases of the application.
• Other duties may be required from time to time.

Experience:
• BS degree in Electronics & Communication Engineering or a related field, or equivalent experience.
• Demonstrated understanding of how end-to-end GUI-based MATLAB applications work.
• 10+ years of experience creating MATLAB-based algorithms.
• Demonstrated communication skills and experience working across disciplines to drive optimal solutions.

Note: Educational requirement: Bachelor's degree in computer science, computer information systems, information technology, or electronics and communication engineering, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of the aforementioned subjects.
Position: Sr. Cognos BI Developer
Location: Dallas, TX

Skills:
• 7+ years of Cognos BI experience (Framework Manager/data modeling).
• Extensive experience on multiple full-lifecycle data warehouse implementations.
• Extensive experience in business intelligence and OLAP.
• SSIS (SQL Server Integration Services) ETL experience, as well as extensive ETL design and programming generally.
• Hands-on skills in data modeling: logical, physical, and dimensional modeling.
• Extensive experience in metadata management and master data management preferred.
• Successful experience working on complex data flow and processing and with large transactional and reporting databases.
• Experience working with data teams, acting as a mentor in the design and development of business intelligence data environments; a desire to provide guidance to businesses cultivating their OLAP expertise.
• Hands-on experience with data integration tools such as Informatica, DataStage, etc.
• Expertise and experience in the development and support of OLAP cubes, data mining, and data analysis tools.
• Skilled in data warehousing, including logical and physical star-schema data models and extraction/transformation of data.
• Skilled in SQL, database and transaction/OLTP processing concepts, query analysis tools, and performance optimization of queries.
• Recent experience with leading Cognos BI tools.

Experience:
• Bachelor's degree in Computer Science or Master's in Computers, or equivalent experience.
• 7+ years of experience; BI and database experience.
Position: Graphic Designer
Location: Dallas, TX

Skills:
• Adobe Photoshop, Adobe Illustrator, CorelDRAW, Adobe Flash.
• Passionate curiosity, imagination, objectivity and self-awareness, crisp communication, flawless execution.

Experience:
• 0-1 year.

Send resume to:

Byte Code Systems, Inc., Attn: HR,
120 Mockingbird Ln,
Coppell, TX 75019.
Email: hr@bytecodesystems.com
