OnlyDataJobs.com

Ventula Consulting
  • Northampton, UK
  • Salary: £70k - 75k

Lead Software Engineer – Java – Global Bank – Machine Learning / Big Data, to £75k + Exceptional Package


A Lead Software Engineer with a strong background in Java development is required to join a new innovation-focused team working on greenfield projects.


My client is working on a number of cutting-edge Machine Learning and AI solutions which are set to revolutionise fraud detection and prevention, so this is a great opportunity to make a real impact on the future of banking technology.


This role would suit a highly ambitious Software Developer who is looking for a genuine challenge.


You will be joining a newly established innovation team within the Bank which consists of highly skilled technical individuals and industry thought leaders.


There is a very real opportunity for rapid career progression into both technical and management focused roles due to the high profile nature of this function.


The ideal Lead Software Engineer will have the following experience:



  • Expert in Java software development – Java 8 or later versions

  • Experience developing Business Critical systems with low latency performance

  • Development background creating solutions using AWS

  • Any experience with Big Data, MongoDB, Spark, MySQL and React / Node would be nice to have, although not a necessity


This role will be based in Northampton and offers a salary of £70k–£75k plus an exceptional package including bonus, strong pension, private healthcare and a host of other benefits.

BIZX, LLC / Slashdot Media / SourceForge.net
  • San Diego, CA

Job Description (your role):


The Senior Data Engineer position is a challenging role that bridges the gap between data management and software development. This role reports directly to and works closely with the Director of Data Management while teaming with our software development group. You will work with the team that is designing and implementing the next generation of our internal systems, replacing legacy technical debt with state-of-the-art design to enable faster product and feature creation in our big data environment.


Our Industry and Company Environment:

Candidates must have the desire to work and collaborate in a fast-paced entrepreneurial environment in the B2B technology marketing and big data space, working with highly motivated co-workers in our downtown San Diego office.


Responsibilities


  • Design interfaces allowing the operations department to fully utilize large data sets
  • Implement machine learning algorithms to sort and organize large data sets
  • Participate in the research, design, and development of software tools
  • Identify, design, and implement process improvements: automating manual processes
  • Optimize data delivery, re-designing infrastructure for greater scalability
  • Analyze and interpret large data sets
  • Build reliable services for gathering & ingesting data from a wide variety of sources
  • Work with peers and stakeholders to plan approach and define success
  • Create efficient methods to clean and curate large data sets


Qualifications

    • Have a B.S., M.S. or Ph.D. in Computer Science or equivalent degree and work experience

    • Deep understanding of developing high efficiency data processing systems

    • Experience with development of applications in mission-critical environments
    • Experience with our stack:
    •      3+ years experience developing in JavaScript, PHP, Symfony
    •      3+ years experience developing and implementing machine learning algorithms
    •      4+ years experience with data science tool sets
    •      3+ years MySQL
    •      Experience with ElasticSearch a plus
    •      Experience with Ceph a plus
 

About BIZX, LLC / Slashdot Media / SourceForge.net


BIZX, including its Slashdot Media division, is a global leader in online professional technology communities such as SourceForge.net, serving over 40M website viewers and over 150M page views each month to an enthusiastic and engaged audience of IT professionals, decision makers, developers and enthusiasts around the world. Our Passport demand generation platform leverages our huge B2B database and is considered best in class by our list of Fortune 1000 customers. Our impressive growth in the demand generation space is fueled by our use of AI, big data technologies, sophisticated systems automation - and great people.


Location - 101 W Broadway, San Diego, CA

AXA Schweiz
  • Winterthur, Switzerland

Do agility, product-driven IT, cloud computing and machine learning appeal to you?
Are you performance-oriented, with the courage to try new things?

We have anchored digital transformation in our DNA!


Your contribution:



  • The role primarily involves engineering (IBM MQ on Linux, z/OS) and operating middleware components (file transfer, web service infrastructure).

  • In detail, that means component ownership (including lifecycle management, provisioning of APIs and self-services, automation of workflows, and creating and maintaining documentation), ensuring reliable operations (you autonomously take the necessary measures; willingness to work occasional weekend/on-call shifts), as well as maintaining and sharing knowledge.

  • Helping, in an agile environment, with the migration of our components to the cloud.


Your skills and talents:



  • You have a completed degree in computer science or comparable experience.

  • Your know-how covers messaging middleware components, ideally IBM MQ on Linux enriched with z/OS expertise; knowledge of RabbitMQ and Kafka would be a bonus.

  • Other middleware components (file transfer and web services) are not entirely unfamiliar to you, and you are at home with transfer protocols and, in particular, the Linux world.

  • You bring solid automation experience to the table (Bash, Python), and REST, APIs and Java(Script) are not foreign concepts to you. Initial programming experience in an object-oriented language, preferably Java, rounds out your profile.

  • You are inclusive, look at challenges from different perspectives, and ask the uncomfortable questions when it matters.

  • You are confident in both German and English.

American Express
  • Phoenix, AZ

Our Software Engineers not only understand how technology works, but how that technology intersects with the people who count on it every single day. Today, creative ideas, insight and new points of view are at the core of how we craft a more powerful, personal and fulfilling experience for all our customers. So if you're passionate about a career building breakthrough software and making an impact on an audience of millions, look no further.

There are hundreds of chances for you to make your mark on Technology and life at American Express. Here's just some of what you'll be doing:

    • Take your place as a core member of an Agile team driving the latest application development practices.
    • Find your opportunity to execute new technologies, write code and perform unit tests, as well as working with data science, algorithms and automation processing
    • Engage your collaborative spirit by working with fellow engineers to craft and deliver recommendations to Finance, Business, and Technical users on Finance Data Management.


Qualifications:

  

Are you up for the challenge?


    • 4+ years of Software Development experience.
    • BS or MS Degree in Computer Science, Computer Engineering, or other Technical discipline including practical experience effectively interpreting Technical and Business objectives and challenges and designing solutions.
    • Ability to effectively collaborate with Finance SMEs and partners at all levels to understand their business processes, and to take overall ownership of analysis, design, estimation and delivery of technical solutions for Finance business requirements and roadmaps, including a deep understanding of Finance and other LOB products and processes. Experience with regulatory reporting frameworks is preferred.
    • Hands-on expertise with application design and software development across multiple platforms, languages, and tools: Java, Hadoop, Python, Streaming, Flink, Spark, HIVE, MapReduce, Unix, NoSQL and SQL Databases is preferred.
    • Working SQL knowledge and experience working with relational databases and query authoring (SQL), including working familiarity with a variety of databases (DB2, Oracle, SQL Server, Teradata, MySQL, HBase, Couchbase, MemSQL).
    • Experience in architecting, designing, and building customer dashboards with data visualization tools such as Tableau using accelerator database Jethro.
    • Extensive experience in application, integration, system and regression testing, including demonstration of automation and other CI/CD efforts.
    • Experience with version control software such as Git and SVN, plus CI/CD testing/automation experience.
    • Proficient with Scaled Agile application development methods.
    • Deals well with ambiguous/under-defined problems; Ability to think abstractly.
    • Willingness to learn new technologies and exploit them to their optimal potential, including substantiated ability to innovate and take pride in quickly deploying working software.
    • Ability to enable business capabilities through innovation is a plus.
    • Ability to get results with an emphasis on reducing time to insights and increased efficiency in delivering new Finance product capabilities into the hands of Finance constituents.
    • Focuses on the Customer and Client with effective consultative skills across a multi-functional environment.
    • Ability to communicate effectively verbally and in writing, including effective presentation skills. Strong analytical skills, problem identification and resolution.
    • Delivering business value using creative and effective approaches
    • Possesses strong business knowledge about the Finance organization, including industry standard methodologies.
    • Demonstrates a strategic/enterprise viewpoint and business insights with the ability to identify and resolve key business impediments.


Employment eligibility to work with American Express in the U.S. is required as the company will not pursue visa sponsorship for these positions.

Pyramid Consulting, Inc
  • Atlanta, GA

Job Title: Tableau Engineer

Duration: 6-12 Months+ (potential to go perm)

Location: Atlanta, GA (30328) - Onsite

Notes from Manager:

We need a data analyst who knows Tableau, scripting (JSON, Python), the Alteryx API, AWS, and analytics.

Description

The Tableau Software Engineer will be a key resource working across our Software Engineering BI/Analytics stack to ensure stability, scalability, and the delivery of valuable BI & Analytics solutions for our leadership teams and business partners. Key to this position is the ability to excel at identifying problems or analytic gaps and at mapping and implementing pragmatic solutions. An excellent blend of analytical, technical and communication skills in a team-based environment is essential for this role.

Tools we use: Tableau, Business Objects, AngularJS, OBIEE, Cognos, AWS, Opinion Lab, JavaScript, Python, Jaspersoft, Alteryx and R packages, Spark, Kafka, Scala, Oracle

Your Role:

·         Able to design, build, maintain & deploy complex reports in Tableau

·         Experience integrating Tableau into another application or native platforms is a plus

·         Expertise in Data Visualization including effective communication, appropriate chart types, and best practices.

·         Knowledge of best practices and experience optimizing Tableau for performance.

·         Experience reverse engineering and revising Tableau Workbooks created by other developers.

·         Understand basic statistical routines (mean, percentiles, significance, correlations) with ability to apply in data analysis

·         Able to turn ideas into creative & statistically sound decision support solutions

Education and Experience:

·         Bachelor's degree in Computer Science or equivalent work experience

·         3-5 Years of hands on experience in data warehousing & BI technologies (Tableau/OBIEE/Business Objects/Cognos)

·         Three or more years of experience in developing reports in Tableau

·         Have a good understanding of Tableau architecture, design, development and the end-user experience.

What We Look For:

·         Very proficient in working with large databases in Oracle; experience with Big Data technologies is a plus.

·         Deep understanding & working experience of data warehouse and data mart concepts.

·         Understanding of Alteryx and R packages is a plus

·         Experience designing and implementing high volume data processing pipelines, using tools such as Spark and Kafka.

·         Experience with Scala, Java or Python and a working knowledge of AWS technologies such as GLUE, EMR, Kinesis and Redshift preferred.

·         Excellent knowledge with Amazon AWS technologies, with a focus on highly scalable cloud-native architectural patterns, especially EMR, Kinesis, and Redshift

·         Experience with software development tools and build systems such as Jenkins

The HT Group
  • Austin, TX

Full Stack Engineer, Java/Scala Direct Hire Austin

Do you have a track record of building both internal- and external-facing software services in a dynamic environment? Are you passionate about introducing disruptive and innovative software solutions for the shipping and logistics industry? Are you ready to deliver immediate impact with the software you create?

We are looking for Full Stack Engineers to craft, implement and deploy new features, services, platforms, and products. If you are curious, driven, and naturally explore how to build elegant and creative solutions to complex technical challenges, this may be the right fit for you. If you value a sense of community and shared commitment, you'll collaborate closely with others in a full-stack role to ship software that delivers immediate and continuous business value. Are you up for the challenge?

Tech Tools:

  • Application stack runs entirely on Docker frontend and backend
  • Infrastructure is 100% Amazon Web Services and we use AWS services whenever possible. Current examples: EC2, Elastic Container Service (Docker), Kinesis, SQS, Lambda and Redshift
  • Java and Scala are the languages of choice for long-lived backend services
  • Python for tooling and data science
  • Postgres is the SQL database of choice
  • Actively migrating to a modern JavaScript-centric frontend built on Node, React/Relay, and GraphQL as some of our core UI technologies

Responsibilities:

  • Build both internal and external REST/JSON services running on our 100% Docker-based application stack or within AWS Lambda
  • Build data pipelines around event-based and streaming-based AWS services and application features
  • Write deployment, monitoring, and internal tooling to operate our software with as much efficiency as we build it
  • Share ownership of all facets of software delivery, including development, operations, and test
  • Mentor junior members of the team and coach them to be even better at what they do

Requirements:

  • Embrace the AWS + DevOps philosophy and believe this is an innovative approach to creating and deploying products and technical solutions that require software engineers to be truly full-stack
  • Have high-quality standards, pay attention to details, and love writing beautiful, well-designed and tested code that can stand the test of time
  • Have built high-quality software, solved technical problems at scale and believe in shipping software iteratively and often
  • Proficient in and have delivered software in Java, Scala, and possibly other JVM languages
  • Developed a strong command over Computer Science fundamentals
GrubHub Seamless
  • New York, NY

Got a taste for something new?

We’re Grubhub, the nation’s leading online and mobile food ordering company. Since 2004 we’ve been connecting hungry diners to the local restaurants they love. We’re moving eating forward with no signs of slowing down.

With more than 90,000 restaurants and over 15.6 million diners across 1,700 U.S. cities and London, we’re delivering like never before. Incredible tech is our bread and butter, but amazing people are our secret ingredient. Rigorously analytical and customer-obsessed, our employees develop the fresh ideas and brilliant programs that keep our brands going and growing.

Long story short, keeping our people happy, challenged and well-fed is priority one. Interested? Let’s talk. We’re eager to show you what we bring to the table.

About the Opportunity: 

Senior Site Reliability Engineers are embedded in Big Data-specific dev teams to focus on the operational aspects of our services, and our SREs run their respective products and services from conception to continuous operation. We're looking for engineers who want to be a part of developing, maintaining and scaling infrastructure software. If you enjoy focusing on reliability, performance, capacity planning, and automating everything, you'd probably like this position.





Some Challenges You’ll Tackle





TOOLS OUR SRE TEAM WORKS WITH:



  • Python – our primary infrastructure language

  • Cassandra

  • Docker (in production!)

  • Splunk, Spark, Hadoop, and PrestoDB

  • AWS

  • Python and Fabric for automation and our CD pipeline

  • Jenkins for builds and task execution

  • Linux (CentOS and Ubuntu)

  • DataDog for metrics and alerting

  • Puppet





You Should Have






  • Experience in AWS services like Kinesis, IAM, EMR, Redshift, and S3

  • Experience managing Linux systems

  • Configuration management tool experiences like Puppet, Chef, or Ansible

  • Continuous integration, testing, and deployment using Git, Jenkins, Jenkins DSL

  • Exceptional communication and troubleshooting skills.


NICE TO HAVE:



  • Python or Java / Scala development experience

  • Bonus points for deploying/operating large-ish Hadoop clusters in AWS/GCP and use of EMR, DC/OS, Dataproc.

  • Experience in Streaming data platforms, (Spark streaming, Kafka)

  • Experience developing solutions leveraging Docker

Avaloq Evolution AG
  • Zürich, Switzerland

The position


Are you passionate about data? Are you interested in shaping the next generation of data science driven products for the financial industry? Do you enjoy working in an agile environment involving multiple stakeholders?

This is a challenging role as Senior Data Scientist in a demanding, dynamic and international software company using the latest innovations in predictive analytics and visualization techniques. You will drive the creation of statistical and machine learning models from prototyping through to final deployment.

We want you to help us strengthen and further develop Avaloq's transformation into a data-driven product company: make analytics scalable and accelerate the process of data science innovation.





Your profile


  • PhD or Master's degree in Computer Science, Math, Physics, Engineering, Statistics or another technical field

  • 5+ years of experience in statistical modelling, anomaly detection and machine learning algorithms, both supervised and unsupervised

  • Proven experience in applying data science methods to business problems

  • Ability to explain complex analytical concepts to people from other fields

  • Proficiency in at least one of the following: Python, R, Java/Scala, SQL and/or SAS

  • Knowledgeable with BigData technologies and architectures (e.g. Hadoop, Spark, stream processing)

  • Expertise in text mining and natural language processing is a strong plus

  • Familiarity with network analysis and/or graph databases is a plus

  • High integrity, responsibility and confidentiality a requirement for dealing with sensitive data

  • Strong presentation and communication skills

  • Experience in leading teams and mentoring others

  • Good planning and organisational skills

  • Collaborative mindset to sharing ideas and finding solutions

  • Experience in the financial industry is a strong plus

  • Fluent in English; German, Italian and French a plus



Professional requirements




  • Use machine learning tools and statistical techniques to produce solutions for customer demands and complex problems

  • Participate in pre-sales and pre-project analysis to develop prototypes and proof-of-concepts

  • Analyse customer behaviour and needs enabling customer-centric product development

  • Liaise and coordinate with internal infrastructure and architecture team regarding setting up and running a BigData & Analytics platform

  • Strengthen data science within Avaloq and establish a data science centre of expertise

  • Look for opportunities to use insights/datasets/code/models across other functions in Avaloq



Main place of work
Zurich

Contact
Avaloq Evolution AG
Alina Tauscher, Talent Acquisition Professional
Allmendstrasse 140 - 8027 Zürich - Switzerland

careers@avaloq.com
www.avaloq.com/en/open-positions

Please only apply online.

Note to Agencies: All unsolicited résumés will be considered direct applicants and no referral fee will be acknowledged.
Avaloq Evolution AG
  • Zürich, Switzerland

The position


Are you passionate about data architecture? Are you interested in shaping the next generation of data science driven products for the financial industry? Do you enjoy working in an agile environment involving multiple stakeholders?

You will be responsible for selecting appropriate technologies from open-source, commercial on-premises and cloud-based offerings, and for integrating a new generation of tools into the existing environment to ensure access to accurate and current data. You will consider not only the functional requirements, but also the non-functional attributes of platform quality such as security, usability, and stability.

We want you to help us strengthen and further develop Avaloq's transformation into a data-driven product company: make analytics scalable and accelerate the process of data science innovation.


Your profile


  • PhD, Master's or Bachelor's degree in Computer Science, Math, Physics, Engineering, Statistics or another technical field

  • Knowledgeable with BigData technologies and architectures (e.g. Hadoop, Spark, data lakes, stream processing)

  • Practical experience with container platforms (OpenShift) and/or containerization software (Kubernetes, Docker)

  • Hands-on experience developing data extraction and transformation pipelines (ETL process)

  • Expert knowledge in RDBMS, NoSQL and Data Warehousing

  • Familiar with information retrieval software such as Elasticsearch/Lucene/Solr

  • Firm understanding of major programming/scripting languages such as Java/Scala, PHP, Python and/or R, along with the Linux environment

  • High integrity, responsibility and confidentiality a requirement for dealing with sensitive data

  • Strong presentation and communication skills

  • Good planning and organisational skills

  • Collaborative mindset to sharing ideas and finding solutions

  • Fluent in English; German, Italian and French a plus





Professional requirements


  • Be a thought leader in best practices for developing and deploying data science products & services

  • Provide an infrastructure to make data driven insights scalable and agile

  • Liaise and coordinate with stakeholders regarding setting up and running a BigData and analytics platform

  • Lead the evaluation of business and technical requirements

  • Support data-driven activities and a data-driven mindset where needed



Main place of work
Zurich

Contact
Avaloq Evolution AG
Anna Drozdowska, Talent Acquisition Professional
Allmendstrasse 140 - 8027 Zürich - Switzerland

www.avaloq.com/en/open-positions

Please only apply online.

Note to Agencies: All unsolicited résumés will be considered direct applicants and no referral fee will be acknowledged.
Accenture
  • Miami, FL
    Please Note: this role requires all employees to be local to the Nashville, TN area. If you do not live there currently, relocation would be required for consideration.***

    Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.

    People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.

    Software Engineering professionals work across the Service Delivery Lifecycle to analyze, design, build, test, implement and/or maintain multiple system components or applications for Accenture or our clients.

    The RPA Application Development Sr. Analyst is responsible for coding the automation process components as per the low-level technical design document created by the technical leads/senior developers. The senior developer will also validate the automation by performing appropriate unit testing and ensure configuration control is maintained at all times. The senior developer will mentor junior developers and perform QC checks on code components developed by them.

    Responsibilities include, but may not be restricted to:
    • Work closely with the Technical Lead to understand the functional and technical design
    • Develop and configure automation processes as per the technical design document (TDD) to meet the defined requirements; code the more complicated automations or reusable components
    • Develop new processes/tasks/objects using core workflow principles that are efficient, well structured, maintainable and easy to understand
    • Comply with and help to enforce design and coding standards, policies and procedures
    • Ensure documentation is well maintained
    • Ensure quality of coded components by performing thorough unit testing
    • Work collaboratively with test teams during the Product test and UAT phases to fix assigned bugs with quality
    • Report status, issues and risks to tech leads on a regular basis
    • Improve skills in automation products by completing automation certification
    • Mentor junior developers and perform code reviews for quality control

Basic Qualifications
    • Must be local to the Nashville, TN area for consideration; hiring in Nashville, TN only
    • Previous professional experience with at least one RPA tool (Blue Prism, UiPath, Automation Anywhere, etc.) or a minimum of one year of professional development in either .Net or Java
    • A minimum of 1 year in a professional role; this is not an entry-level position
Preferred Qualifications
    • A Bachelor's degree or relevant work experience
    • Experience with Robotic Automation Process using products like any of the following: Blue Prism, Automation Anywhere, UIPath
    • Self-motivated, team player, action and results oriented.
    • Well organized, good communication and reporting skills.
    • Working experience in coding on at least 2 projects.
Professional Skill Requirements
    • Professional experience with Microsoft Office and Microsoft Visual Studio
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H1-B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
About Accenture
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With more than 435,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com.
Accenture
  • Atlanta, GA
    Please Note: this role requires all employees to be local to the Nashville, TN area. If you do not live there currently, relocation would be required for consideration.***

    Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements and the way we collaborate, operate and deliver value provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.

    People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.

    Software Engineering professionals work across the Service Delivery Lifecycle to analyze, design, build, test, implement and/or maintain multiple system components or applications for Accenture or our clients.

    The RPA Application Development Sr. Analyst is responsible for coding the automation process components as per the low-level technical design document created by the technical leads/senior developers. The senior developer will also validate the automation by performing appropriate unit testing and ensure configuration control is maintained at all times. The senior developer will mentor junior developers and perform QC checks on code components developed by them.

    Responsibilities include, but may not be restricted to:
    • Work closely with the Technical Lead to understand the functional and technical design
    • Develop and configure automation processes as per the technical design document (TDD) to meet the defined requirements; code the more complicated automations or reusable components
    • Develop new processes/tasks/objects using core workflow principles that are efficient, well structured, maintainable and easy to understand
    • Comply with and help to enforce design and coding standards, policies and procedures
    • Ensure documentation is well maintained
    • Ensure quality of coded components by performing thorough unit testing
    • Work collaboratively with test teams during the Product test and UAT phases to fix assigned bugs with quality
    • Report status, issues and risks to tech leads on a regular basis
    • Improve skills in automation products by completing automation certification
    • Mentor junior developers and perform code reviews for quality control

Basic Qualifications
    • Must be local to the Nashville, TN area for consideration; hiring in Nashville, TN only
    • Previous professional experience with at least one RPA tool (Blue Prism, UiPath, Automation Anywhere, etc.) or a minimum of one year of professional development in either .Net or Java
    • A minimum of 1 year in a professional role; this is not an entry-level position
Preferred Qualifications
    • A Bachelor's degree or relevant work experience
    • Experience with Robotic Automation Process using products like any of the following: Blue Prism, Automation Anywhere, UIPath
    • Self-motivated, team player, action and results oriented.
    • Well organized, good communication and reporting skills.
    • Working experience in coding on at least 2 projects.
Professional Skill Requirements
    • Professional experience with Microsoft Office and Microsoft Visual Studio
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
About Accenture
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With more than 435,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com.
SafetyCulture
  • Surry Hills, Australia
  • Salary: A$120k - 140k

The Role



  • Be an integral member of the team responsible for designing, implementing and maintaining a distributed, big-data-capable system with high-quality components (Kafka, EMR + Spark, Akka, etc).

  • Embrace the challenge of dealing with big data on a daily basis (Kafka, RDS, Redshift, S3, Athena, Hadoop/HBase), perform data ETL, and build tools for proper data ingestion from multiple data sources.

  • Collaborate closely with data infrastructure engineers and data analysts across different teams to find bottlenecks and solve problems

  • Design, implement and maintain the heterogeneous data processing platform to automate the execution and management of data-related jobs and pipelines

  • Implement automated data workflows in collaboration with data analysts, and continually maintain and improve the system in line with growth

  • Collaborate with Software Engineers on application events, ensuring the right data can be extracted

  • Contribute to resources management for computation and capacity planning

  • Diving deep into code and constantly innovating


Requirements



  • Experience with AWS data technologies (EC2, EMR, S3, Redshift, ECS, Data Pipeline, etc) and infrastructure.

  • Working knowledge of big data frameworks such as Apache Spark, Kafka, Zookeeper, Hadoop, Flink, Storm, etc

  • Extensive experience with Linux and database systems

  • Experience with relational and NoSQL databases, query optimization, and data modelling

  • Familiar with one or more of the following: Scala/Java, SQL, Python, Shell, Golang, R, etc

  • Experience with container technologies (Docker, k8s), Agile development, DevOps and CI tools.

  • Excellent problem-solving skills

  • Excellent verbal and written communication skills 

Accenture
  • Raleigh, NC
Please Note: this role requires all employees to be local to the Nashville, TN area. If you do not live there currently, relocation would be required for consideration.

Join Accenture and help transform leading organizations and communities around the world. The sheer scale of our capabilities and client engagements, and the way we collaborate, operate and deliver value, provides an unparalleled opportunity to grow and advance. Choose Accenture and make delivering innovative work part of your extraordinary career.

People in our Client Delivery & Operations career track drive delivery and capability excellence through the design, development and/or delivery of a solution, service, capability or offering. They grow into delivery-focused roles, and can progress within their current role, laterally or upward.

Software Engineering professionals work across the Service Delivery Lifecycle to analyze, design, build, test, implement and/or maintain multiple system components or applications for Accenture or our clients.

The RPA Application Development Sr. Analyst is responsible for coding the automation process components as per the low-level technical design document created by the technical leads/senior developers. The senior developer will also validate the automation by performing appropriate unit testing and ensure configuration control is maintained at all times. The senior developer will mentor junior developers and perform QC checks on code components developed by them.

Responsibilities Include, But May Not Be Restricted To
    • Works closely with the Technical Lead to understand the functional and technical design
    • Develops and configures automation processes as per the technical design document (TDD) to meet the defined requirements; works on coding the more complicated automations or reusable components
    • Develops new processes/tasks/objects using core workflow principles that are efficient, well structured, maintainable and easy to understand
    • Complies with and helps to enforce design and coding standards, policies and procedures
    • Ensures documentation is well maintained
    • Ensures quality of coded components by performing thorough unit testing
    • Works collaboratively with test teams during the Product test and UAT phases to fix assigned bugs with quality
    • Reports status, issues and risks to tech leads on a regular basis
    • Improves skills in automation products by completing automation certification
    • Mentors junior developers and performs code reviews for quality control

Basic Qualifications
    • Must be local to the Nashville, TN area for consideration; hiring in Nashville, TN only
    • Previous professional experience with at least one RPA tool (Blue Prism, UiPath, Automation Anywhere, etc.) or a minimum of one year of professional development in either .Net or Java
    • A minimum of 1 year in a professional role; this is not an entry-level position
Preferred Qualifications
    • A Bachelor's degree or relevant work experience
    • Experience with Robotic Automation Process using products like any of the following: Blue Prism, Automation Anywhere, UIPath
    • Self-motivated, team player, action and results oriented.
    • Well organized, good communication and reporting skills.
    • Working experience in coding on at least 2 projects.
Professional Skill Requirements
    • Professional experience with Microsoft Office and Microsoft Visual Studio
    • Proven success in contributing to a team-oriented environment
    • Proven ability to work creatively and analytically in a problem-solving environment
    • Desire to work in an information systems environment
All of our consulting professionals receive comprehensive training covering business acumen, technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. We offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work we do, and the experience it offers, provide an unbeatable platform from which to build a career.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States and with Accenture (i.e., H-1B visa, F-1 visa (OPT), TN visa or any other non-immigrant status).
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Accenture is a Federal Contractor and an EEO and Affirmative Action Employer of Females/Minorities/Veterans/Individuals with Disabilities.
Equal Employment Opportunity
All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process.
Accenture is committed to providing veteran employment opportunities to our service men and women.
About Accenture
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With more than 435,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com.
State Farm
  • Atlanta, GA

WHAT ARE THE DUTIES AND RESPONSIBILITIES OF THIS POSITION?

    • Performs improved visual representation of data to allow clearer communication, viewer engagement and faster/better decision-making
    • Investigates, recommends, and initiates acquisition of new data resources from internal and external sources
    • Works with IT teams to support data collection, integration, and retention requirements based on business need
    • Identifies critical and emerging technologies that will support and extend quantitative analytic capabilities
    • Manages work efforts which require the use of sophisticated project planning techniques
    • Applies complex principles, theories and concepts in a specific field to provide solutions to a wide range of difficult problems
    • Develops and maintains an effective network of both scientific and business contacts/knowledge, obtaining relevant information and intelligence around the market and emergent opportunities
    • Contributes data to State Farm's internal and external publications, writes articles for leading journals and participates in academic and industry conferences
    • Collaborates with business subject matter experts to select relevant sources of information
    • Develop breadth of knowledge in programming (R, Python), Descriptive, Inferential, and Experimental Design statistics, advanced mathematics, and database functionality (SQL, Hadoop)
    • Develop expertise with multiple machine learning algorithms and data science techniques, such as exploratory data analysis, generative and discriminative predictive modeling, graph theory, recommender systems, text analytics, computer vision, deep learning, optimization and validation
    • Develop expertise with State Farm datasets, data repositories, and data movement processes
    • Assists on projects/requests and may lead specific tasks within the project scope
    • Prepares and manipulates data for use in development of statistical models
    • Develops fundamental understanding of insurance and financial services operations and uses this knowledge in decision making


Additional Details:

For over 95 years, data has been key to State Farm.  As a member of our data science team with the Enterprise Data & Analytics department under our Chief Data & Analytics Officer, you will work across the organization to solve business problems and help achieve business strategies.  You will employ sophisticated, statistical approaches and state of the art technology.  You will build and refine our tools/techniques and engage w/internal stakeholders across the organization to improve our products & services.


Implementing solutions is critical for success. You will do problem identification, solution proposal & presentation to a wide variety of management & technical audiences. This challenging career requires you to work on multiple concurrent projects in a community setting, developing yourself and others, and advancing data science both at State Farm and externally.


Skills & Professional Experience

·        Develop hypotheses, design experiments, and test feasibility of proposed actions to determine probable outcomes using a variety of tools & technologies

·        Master's or other advanced degree, or five years' experience in an analytical field such as data science, quantitative marketing, statistics, operations research, management science, industrial engineering, economics, etc., or equivalent practical experience preferred.

·        Experience with SQL, Python, R, Java, SAS, MapReduce, or Spark

·        Experience with unstructured data sets: text analytics, image recognition etc.

·        Experience working w/numerous large data sets/data warehouses & ability to pull from such data sets using relevant programs & coding including files, RDBMS & Hadoop based storage systems

·        Knowledge of machine learning methods, including at least one of the following: time series analysis, hierarchical Bayes, or learning techniques such as decision trees, boosting, and random forests.

·        Excellent communication skills and the ability to manage multiple diverse stakeholders across businesses & leadership levels.

·        Exercise sound judgment to diagnose & resolve problems within area of expertise

·        Familiarity with CI/CD development methods, Git and Docker a plus


Multiple location opportunity. Locations offered are: Atlanta, GA, Bloomington, IL, Dallas, TX and Phoenix, AZ


Remote work option is not available.


There is no sponsorship for an employment visa for the position at this time.


Competencies desired:
Critical Thinking
Leadership
Initiative
Resourcefulness
Relationship Building
State Farm
  • Dallas, TX

WHAT ARE THE DUTIES AND RESPONSIBILITIES OF THIS POSITION?

    • Performs improved visual representation of data to allow clearer communication, viewer engagement and faster/better decision-making
    • Investigates, recommends, and initiates acquisition of new data resources from internal and external sources
    • Works with IT teams to support data collection, integration, and retention requirements based on business need
    • Identifies critical and emerging technologies that will support and extend quantitative analytic capabilities
    • Manages work efforts which require the use of sophisticated project planning techniques
    • Applies complex principles, theories and concepts in a specific field to provide solutions to a wide range of difficult problems
    • Develops and maintains an effective network of both scientific and business contacts/knowledge, obtaining relevant information and intelligence around the market and emergent opportunities
    • Contributes data to State Farm's internal and external publications, writes articles for leading journals and participates in academic and industry conferences
    • Collaborates with business subject matter experts to select relevant sources of information
    • Develop breadth of knowledge in programming (R, Python), Descriptive, Inferential, and Experimental Design statistics, advanced mathematics, and database functionality (SQL, Hadoop)
    • Develop expertise with multiple machine learning algorithms and data science techniques, such as exploratory data analysis, generative and discriminative predictive modeling, graph theory, recommender systems, text analytics, computer vision, deep learning, optimization and validation
    • Develop expertise with State Farm datasets, data repositories, and data movement processes
    • Assists on projects/requests and may lead specific tasks within the project scope
    • Prepares and manipulates data for use in development of statistical models
    • Develops fundamental understanding of insurance and financial services operations and uses this knowledge in decision making


Additional Details:


For over 95 years, data has been key to State Farm.  As a member of our data science team with the Enterprise Data & Analytics department under our Chief Data & Analytics Officer, you will work across the organization to solve business problems and help achieve business strategies.  You will employ sophisticated, statistical approaches and state of the art technology.  You will build and refine our tools/techniques and engage w/internal stakeholders across the organization to improve our products & services.


Implementing solutions is critical for success. You will do problem identification, solution proposal & presentation to a wide variety of management & technical audiences. This challenging career requires you to work on multiple concurrent projects in a community setting, developing yourself and others, and advancing data science both at State Farm and externally.


Skills & Professional Experience

·        Develop hypotheses, design experiments, and test feasibility of proposed actions to determine probable outcomes using a variety of tools & technologies

·        Master's or other advanced degree, or five years' experience in an analytical field such as data science, quantitative marketing, statistics, operations research, management science, industrial engineering, economics, etc., or equivalent practical experience preferred.

·        Experience with SQL, Python, R, Java, SAS, MapReduce, or Spark

·        Experience with unstructured data sets: text analytics, image recognition etc.

·        Experience working w/numerous large data sets/data warehouses & ability to pull from such data sets using relevant programs & coding including files, RDBMS & Hadoop based storage systems

·        Knowledge of machine learning methods, including at least one of the following: time series analysis, hierarchical Bayes, or learning techniques such as decision trees, boosting, and random forests.

·        Excellent communication skills and the ability to manage multiple diverse stakeholders across businesses & leadership levels.

·        Exercise sound judgment to diagnose & resolve problems within area of expertise

·        Familiarity with CI/CD development methods, Git and Docker a plus


Multiple location opportunity. Locations offered are: Atlanta, GA, Bloomington, IL, Dallas, TX and Phoenix, AZ


Remote work option is not available.


There is no sponsorship for an employment visa for the position at this time.


Competencies desired:
Critical Thinking
Leadership
Initiative
Resourcefulness
Relationship Building
Riccione Resources
  • Dallas, TX

Sr. Data Engineer Hadoop, Spark, Data Pipelines, Growing Company

One of our clients is looking for a Sr. Data Engineer in the Fort Worth, TX area! Build your data expertise with projects centering on large Data Warehouses and new data models! Think outside the box to solve challenging problems! Thrive in the variety of technologies you will use in this role!

Why should I apply here?

    • Culture built on creativity and respect for engineering expertise
    • Nominated as one of the Best Places to Work in DFW
    • Entrepreneurial environment, growing portfolio and revenue stream
    • One of the fastest growing mid-size tech companies in DFW
    • Executive management with past successes in building firms
    • Leader of its technology niche, setting the standards
    • A robust, fast-paced work environment
    • Great technical challenges for top-notch engineers
    • Potential for career growth, emphasis on work/life balance
    • A remodeled office with a bistro, lounge, and foosball

What will I be doing?

    • Building data expertise and owning data quality for the transfer pipelines that you create to transform and move data to the company's large Data Warehouse
    • Architecting, constructing, and launching new data models that provide intuitive analytics to customers
    • Designing and developing new systems and tools to enable clients to optimize and track advertising campaigns
    • Using your expert skills across a number of platforms and tools such as Ruby, SQL, Linux shell scripting, Git, and Chef
    • Working across multiple teams in high visibility roles and owning the solution end-to-end
    • Providing support for existing production systems
    • Broadly influencing the company's clients and internal analysts

What skills/experiences do I need?

    • B.S. or M.S. degree in Computer Science or a related technical field
    • 5+ years of experience working with Hadoop and Spark
    • 5+ years of experience with Python or Ruby development
    • 5+ years of experience with efficient SQL (Postgres, Vertica, Oracle, etc.)
    • 5+ years of experience building and supporting applications on Linux-based systems
    • Background in engineering Spark data pipelines
    • Understanding of distributed systems

What will make my résumé stand out?

    • Ability to customize an ETL or ELT
    • Experience building an actual data warehouse schema

Location: Fort Worth, TX

Citizenship: U.S. citizens and those authorized to work in the U.S. are encouraged to apply. This company is currently unable to provide sponsorship (e.g., H-1B).

Salary: $115k - $130k + 401(k) match

---------------------------------------------------


~SW1317~

Gravity IT Resources
  • Miami, FL

Overview of Position:

We are undertaking an ambitious digital transformation across Sales, Service, Marketing, and eCommerce. We are looking for a web data analytics wizard with prior experience in digital data preparation, discovery, and predictive analytics.

The data scientist/web analyst will work with external partners, digital business partners, enterprise analytics, and the technology team to strategically plan and develop datasets, measure web analytics, and execute on predictive and prescriptive use cases. The role demands the ability to (1) learn quickly, (2) work in a fast-paced, team-driven environment, (3) manage multiple efforts simultaneously, (4) use large datasets and models to test the effectiveness of different courses of action, (5) promote data-driven decision making throughout the organization, and (6) define and measure the success of the capabilities we provide the organization.


Primary Duties and Responsibilities

    • Analyze data captured through Google Analytics and develop meaningful, actionable insights on digital behavior.
    • Put together a customer 360 data frame by connecting CRM Sales, Service, and Marketing cloud data with Commerce web behavior data, and wrangle the data into a usable form.
    • Use predictive modelling to increase and optimize customer experiences across online & offline channels.
    • Evaluate customer experience and conversions to provide insights & tactical recommendations for web optimization.
    • Execute on digital predictive use cases and collaborate with the enterprise analytics team to ensure use of the best tools and methodologies.
    • Lead support for enterprise voice-of-customer feedback analytics.
    • Enhance and maintain the digital data library and definitions.

Minimum Qualifications

  • Bachelor's degree in Statistics, Computer Science, Marketing, Engineering or equivalent
  • 3 years or more of working experience in building predictive models.
  • Experience in Google Analytics or similar web behavior tracking tools is required.
  • Experience in R is a must, with working knowledge of connecting to multiple data sources such as Amazon Redshift, Salesforce, Google Analytics, etc.
  • Working knowledge in machine learning algorithms such as Random Forest, K-means, Apriori, Support Vector machine, etc.
  • Experience in A/B testing or multivariate testing.
  • Experience in media tracking tags and pixels, UTM, and custom tracking methods.
  • Microsoft Office Excel & PPT (advanced).

Preferred Qualifications

  • Master's degree in statistics or equivalent.
  • Google Analytics 360 experience/certification.
  • SQL workbench, Postgres.
  • Alteryx experience is a plus.
  • Tableau experience is a plus.
  • Experience in HTML, JavaScript.
  • Experience in SAP Analytics Cloud or SAP desktop predictive tools is a plus
Signify Health
  • Dallas, TX

Position Overview:

Signify Health is looking for a savvy Data Engineer to join our growing team of deep learning specialists. This position would be responsible for evolving and optimizing data and data pipeline architectures, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will support software developers, database architects, data analysts, and data scientists. The ideal candidate would be self-directed, passionate about optimizing data, and comfortable supporting the data wrangling needs of multiple teams, systems and products.

If you enjoy providing expert-level IT technical services, including the direction, evaluation, selection, configuration, implementation, and integration of new and existing technologies and tools, while working closely with IT team members, data scientists, and data engineers to build our next generation of AI-driven solutions, we will give you the opportunity to grow personally and professionally in a dynamic environment. Our projects are built on cooperation and teamwork, and you will find yourself working together with other talented, passionate and dedicated team members, all working towards a shared goal.

Essential Job Responsibilities:

  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing data models for greater scalability, etc.
  • Leverage Azure for extraction, transformation, and loading of data from a wide variety of data sources in support of AI/ML Initiatives
  • Design and implement high performance data pipelines for distributed systems and data analytics for deep learning teams
  • Create tool-chains for analytics and data scientist team members that assist them in building and optimizing AI workflows
  • Work with data and machine learning experts to strive for greater functionality in our data and model life cycle management capabilities
  • Communicate results and ideas to key decision makers in a concise manner
  • Comply with applicable legal requirements, standards, policies and procedures including, but not limited to the Compliance requirements and HIPAA.


Qualifications:

Education/Licensing Requirements:
  • High school diploma or equivalent.
  • Bachelor's degree in Computer Science, Electrical Engineering, Statistics, Informatics, Information Systems, or another quantitative field, or equivalent work experience.


Experience Requirements:
  • 5+ years of experience in a Data Engineer role.
  • Experience using the following software/tools preferred:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with AWS or Azure cloud services.
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C#, etc.
  • Strong work ethic, with the ability to work both collaboratively and independently without close supervision, and solid problem-solving skills
  • Strong communication skills (written and verbal) and good one-on-one interpersonal skills
  • Advanced working SQL knowledge and experience with relational databases, including query authoring and familiarity with a variety of database systems
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable big data stores
  • 2 years of experience in data modeling, ETL development, and data warehousing

Essential Skills:

  • Fluently speak, read, and write English
  • Fantastic motivator and leader of teams with a demonstrated track record of mentoring and developing staff members
  • Strong point of view on who to hire and why
  • Passion for solving complex system and data challenges and desire to thrive in a constantly innovating and changing environment
  • Excellent interpersonal skills, including teamwork and negotiation
  • Excellent leadership skills
  • Superior analytical abilities, problem solving skills, technical judgment, risk assessment abilities and negotiation skills
  • Proven ability to prioritize and multi-task
  • Advanced skills in MS Office

Essential Values:

  • In Leadership: Do what's right, even if it's tough
  • In Collaboration: Leverage our collective genius, be a team
  • In Transparency: Be real
  • In Accountability: Recognize that if it is to be, it's up to me
  • In Passion: Show commitment in heart and mind
  • In Advocacy: Earn trust and business
  • In Quality: Ensure what we do, we do well

Working Conditions:
  • Fast-paced environment
  • Requires working at a desk and use of a telephone and computer
  • Normal sight and hearing ability
  • Use office equipment and machinery effectively
  • Ability to ambulate to various parts of the building
  • Ability to bend, stoop
  • Work effectively with frequent interruptions
  • May require occasional overtime to meet project deadlines
  • Lifting requirements of
Sentek Global
  • San Diego, CA

Sentek Global is seeking a Software Engineer to provide support to PMW 150 in San Diego, CA!


Responsibilities
  • Design, build and maintain software, develop software infrastructure and development environments, and transition older products and capabilities to the new architectures.
  • Produce effective and powerful solutions to complex problems in areas such as software engineering, data analytics, automation, and cybersecurity.
  • Perform analysis of existing and emerging operational and functional requirements to support the current and future systems capabilities and requirements.
  • Provide technical expertise, guidance, architecture, development and support in many different technologies directly to government customers.
  • Perform schedule planning and program management tasks as required.
  • Perform Risk Analysis for implementation of program requirements.
  • Assist in the development of requirements documents.
  • Other duties as required.


Qualifications
  • A current active secret clearance is required to be considered for this role.
  • A Bachelor's degree in data science, data analytics, computer science, or a related technical discipline is required.
  • Three to five (3-5) years of experience providing software engineering support to a DoD program office.
  • Experience working with data rich problems through research or programs.
  • Experience with computer programming or user experience/user interface.
  • Demonstrated knowledge completing projects with large or incomplete data and ability to recommend solutions.
  • Experience with Machine Learning algorithms including convolutional neural networks (CNN), regression, classification, clustering, etc.
  • Experience using deep learning frameworks (preferably TensorFlow).
  • Experience designing and developing professional software using Linux, Python, C++, Java, etc.
  • Experience applying deep/machine learning technology to solve real-world problems:
    • Selecting features, building, and optimizing classifiers using machine learning techniques.
    • Data mining using state-of-the-art methods.
    • Extending the company's data with third-party sources of information when needed.
    • Enhancing data collection procedures to include information that is relevant for building analytic systems.
  • Experience processing, cleansing, and verifying the integrity of data used for analysis.
  • Experience performing ad-hoc analyses and presenting results in a clear manner.
  • Experience creating automated anomaly detection systems and constantly tracking their performance.
  • Must be able to travel one to three (1-3) times per year.