OnlyDataJobs.com

Huntech USA LLC
  • San Diego, CA

Great opportunity to work with the leader in the semiconductor industry, which unveiled the world's first 7-nanometer PC platform, created from the ground up for the next generation of personal computing by bringing new features with thin and light designs, allowing for new form factors in the always-on, always-connected category. It features a new octa-core CPU, the fastest CPU the company has ever designed and built, with a larger cache than previous compute platforms, faster multi-tasking and increased productivity for users, disrupting the performance expectations of current thin, light and fanless PC designs. This platform is currently sampling to customers and is expected to begin shipping in commercial devices in Q3 of 2019.


Staff Data Analyst

You will study the performance of the Global Engineering Grid and design workflows across the engineering grid and provide insights and effective analytics in support of the Grid 2.0 program. You will conduct research, design statistical studies and analyze data in support of the Grid 2.0 program. This job will challenge you to dive deep into the engineering grid and design flow world and understand the unique challenges of operating an engineering grid at a scale unrivaled in the industry. You should have experience working in an EDA or manufacturing environment and be comfortable working in an environment where problems are not always well-defined.


Responsibilities:

  • Identify and pursue opportunities to improve the efficiency of global engineering grid and design workflows.
  • Develop systems to ingest, analyze, and take automated action across real-time feeds of high-volume data.
  • Research and implement new analytics approaches for effective deployment of machine learning/data modeling to solve business problems. Identify patterns and trends from large, high-dimensional data sets; manipulate data into digestible and actionable reports.
  • Make business recommendations (e.g. cost-benefit, experiment analysis) with effective presentations of findings to multiple levels of stakeholders through visual displays of quantitative information.
  • Plan effectively to set priorities and manage projects, identify roadblocks and come up with technical options.


Leverage your 8+ years of experience articulating business questions and using mathematical techniques to arrive at an answer using available data. 3-4 years of advanced Tableau experience is a must. Experience translating analysis results into business recommendations. Experience with statistical software (e.g., R, Python, MATLAB, pandas, Scala) and database languages like SQL. Experience with data warehousing concepts (Hadoop, MapR) and visualization tools (e.g., QlikView, Tableau, Angular, ThoughtSpot). Strong business acumen, critical thinking ability, and attention to detail.


Background in data science, applied mathematics, or computational science and a history of solving difficult problems using a scientific approach with MS or BS degree in a quantitative discipline (e.g., Statistics, Applied Mathematics, Operations Research, Computer Science, Electrical Engineering) and understand how to design scientific studies. You should be familiar with the state of the art in machine learning/ data modeling/ forecasting and optimization techniques in a big data environment.



Data Analytics Software Test Engineer

As a member of the Corporate Engineering Services Group software test team, you will be responsible for testing various cutting edge data analytics products and solutions. You will be working with a dynamic engineering team to develop test plans, execute test plans, automate test cases, and troubleshoot and resolve issues.


Leverage your 1+ years of experience in the following:

  • Testing and systems validation for commercial software systems.
  • Testing of systems deployed in AWS Cloud.
  • Knowledge of SQL and databases.
  • Developing and implementing software and systems test plans.
  • Test automation development using Python or Java.
  • Strong problem solving and troubleshooting skills.
  • Experience in testing web-based and Android applications.
  • Familiar with Qualcomm QXDM and APEX tools.
  • Knowledge of software development in Python.
  • Strong written and oral communication skills
  • Working knowledge of JIRA and GitHub is preferred.


Education:

  • Required: Bachelor's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering
  • Preferred: Master's, Computer Engineering and/or Computer Networks & Systems and/or Computer Science and/or Electrical Engineering or equivalent experience


Interested? Please send a resume to our Founder & CEO, Raj Dadlani at raj@huntech.com and he will respond to interested candidates within 24 hours of resume receipt. We are dealing with a highly motivated hiring manager and shortlisting viable candidates by February 22, 2019.

Webtrekk GmbH
  • Berlin, Germany
Your responsibilities:

In this role, you will set up a full-fledged research and development team of developers and data science engineers. You will evaluate and choose appropriate technologies and develop products that are powered by Artificial Intelligence and Machine Learning.



  • Fast-paced development of experimental prototypes, POCs and products for our >400 customers

  • Manage fast feedback cycles, incorporate learnings and feedback, and ultimately deliver AI-powered products

  • You will develop new components and optimise existing ones, always with an eye on scalability, performance and maintainability

  • Organize and lead team planning meetings and provide advice, clarification and guidance during the execution of sprints

  • Lead your team's technical vision and drive the design and development of new innovative products and services from the technical side

  • Lead discussions with the team and management to define best practices and approaches

  • Set goals, objectives and priorities. Mentor team members and provide guidance by regular performance reviews.




The assets you bring to the team:


  • Hands on experience in agile software development on all levels based on profound technical understanding

  • Relevant experience in managing a team of software developers in an agile environment

  • At least 3 years of hands-on experience with developing in Frontend Technologies like Angular or React

  • Knowledge of backend technologies such as Java, Python or Scala is a big plus

  • Experience with distributed systems based on RESTful services

  • DevOps mentality and practical experience with tools for build and deployment automation (like Maven, Jenkins, Ansible, Docker)

  • Team and project-oriented leader with excellent problem solving and interpersonal skills

  • Excellent communication, coaching and conflict management skills, as well as strong assertiveness

  • Strong analytical capability, discipline, commitment and enthusiasm

  • Fluent in English, German language skills are a big plus




What we offer:


  • Prospect: We are a continuously growing team with experts in the most future-oriented fields of customer intelligence. We are dealing with real big data scenarios and data from various business models and industries. Apart from interesting tasks we offer you considerable freedom for your ideas and perspectives for the development of your professional and management skills.

  • Team oriented atmosphere: Our culture embraces integrity, team work and innovation. Our employees value the friendly atmosphere that is the most powerful driver within our company.

  • Goodies: Individual trainings, company tickets, team events, table soccer, fresh fruits and a sunny roof terrace.

  • TechCulture: Work with experienced developers who share the ambition for well-written and clean code. Choose your hardware, OS and IDE. Bring in your own ideas, work with open source and have fun at product demos, hackathons and meetups.

Citizens Advice
  • London, UK
  • Salary: £40k - 45k

As a Database engineer in the DevOps team here at Citizens Advice you will help us develop and implement our data strategy. You will have the opportunity to work with both core database technologies and big data solutions.


Past


Starting from scratch, we have built a deep tech-stack with AWS services at its core. We created a new CRM system, migrated a huge amount of data to AWS Aurora PG and used AWS RDS to run some of our business critical databases.


You will have gained a solid background and in-depth knowledge of AWS RDS and SQL/administration against DBMSs such as PostgreSQL / MySQL / SQL Server and DynamoDB / Aurora. You will have dealt with Data Warehousing, ETL, DB Mirroring/Replication, and DB Security Mechanisms & Techniques.


Present


We use AWS RDS including Aurora as the standard DB implementation for our applications. We parse data in S3 using Spark jobs and we are planning to implement a data lake solution in AWS.
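As a rough illustration of that parsing step, here is a minimal sketch of a PySpark job reading raw data from S3 and writing partitioned Parquet. The bucket names, paths and columns are hypothetical, not our actual setup.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import to_date, col

    spark = SparkSession.builder.appName("parse-raw-data").getOrCreate()

    # Read raw JSON exports from S3 (bucket and prefix are made up for illustration)
    raw = spark.read.json("s3a://example-raw-bucket/exports/*.json")

    # Keep only well-formed rows and derive a partition column
    cleaned = (raw
               .dropna(subset=["id", "created_at"])
               .withColumn("created_date", to_date(col("created_at"))))

    # Write partitioned Parquet, ready for the warehouse or a future data lake
    (cleaned.write
            .mode("overwrite")
            .partitionBy("created_date")
            .parquet("s3a://example-curated-bucket/parquet/records/"))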


Our tools and technologies include:



  • Postgres on AWS RDS

  • SQL Server for our Data Warehouse

  • Liquibase for managing the DW schema

  • Jenkins 2 for task automation

  • Spark / Parquet / AWS Glue for parsing raw data

  • Docker / docker-compose for local testing


You will be developing, supporting and maintaining automation tools to drive database, reporting and maintenance tasks.


As part of our internal engineering platform offering, R&D time will give you the opportunity to develop POC solutions to integrate with the rest of the business.


Future


You will seek continuous improvement and implement solutions to help Citizens Advice deliver digital products better and quicker.


You will be helping us implement a data lake solution to improve operations and to offer innovative services.


You will have dedicated investment time at Citizens Advice to learn new skills, technologies, research topics or work on tools that make this possible.

Man AHL
  • London, UK

The Role


As a Quant Platform Developer at AHL you will be building the tools, frameworks, libraries and applications which power our Quantitative Research and Systematic Trading. This includes responsibility for the continued success of “Raptor”, our in-house Quant Platform, next generation Data Engineering, and evolution of our production Trading System as we continually expand the markets and types of assets we trade, and the styles in which we trade them. Your challenges will be varied and might involve building new high performance data acquisition and processing pipelines, cluster-computing solutions, numerical algorithms, position management systems, visualisation and reporting tools, operational user interfaces, continuous build systems and other developer productivity tools.


The Team


Quant Platform Developers at AHL are all part of our broader technology team, members of a group of over sixty individuals representing eighteen nationalities. We have varied backgrounds including Computer Science, Mathematics, Physics, Engineering – even Classics - but what unifies us is a passion for technology and writing high-quality code.



Our developers are organised into small cross-functional teams, with our engineering roles broadly of two kinds: “Quant Platform Developers” otherwise known as our “Core Techs”, and “Quant Developers” which we often refer to as “Sector Techs”. We use the term “Sector Tech” because some of our teams are aligned with a particular asset class or market sector. People often rotate teams in order to learn more about our system, as well as find the position that best matches their interests.


Our Technology


Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas, scikit-learn to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. For storage, we rely heavily on MongoDB and Oracle.



We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, OpenStack for our private cloud, Ansible for architecture automation, and HipChat for internal communication. But our technology list is never static: we constantly evaluate new tools and libraries.


Working Here


AHL has a small company, no-attitude feel. It is flat structured, open, transparent and collaborative, and you will have plenty of opportunity to grow and have enormous impact on what we do.  We are actively engaged with the broader technology community.



  • We host and sponsor London’s PyData and Machine Learning Meetups

  • We open-source some of our technology. See https://github.com/manahl

  • We regularly talk at leading industry conferences, and tweet about relevant technology and how we’re using it. See @manahltech



We’re fortunate enough to have a fantastic open-plan office overlooking the River Thames, and continually strive to make our environment a great place in which to work.



  • We organise regular social events, everything from photography through climbing, karting, wine tasting and monthly team lunches

  • We have annual away days and off-sites for the whole team

  • We have a canteen with a daily allowance for breakfast and lunch, and an on-site bar for the evening

  • As well as PCs and Macs, in our office you’ll also find numerous pieces of cool tech such as light cubes and 3D printers, guitars, ping-pong and table-football, and a piano.



We offer competitive compensation, a generous holiday allowance, various health and other flexible benefits. We are also committed to continuous learning and development via coaching, mentoring, regular conference attendance and sponsoring academic and professional qualifications.


Technology and Business Skills


At AHL we strive to hire only the brightest and best and most highly skilled and passionate technologists.



Essential



  • Exceptional technology skills; recognised by your peers as an expert in your domain

  • A proponent of strong collaborative software engineering techniques and methods: agile development, continuous integration, code review, unit testing, refactoring and related approaches

  • Expert knowledge in one or more programming languages, preferably Python, Java and/or C/C++

  • Proficient on Linux platforms with knowledge of various scripting languages

  • Strong knowledge of one or more relevant database technologies e.g. Oracle, MongoDB

  • Proficient with a range of open source frameworks and development tools e.g. NumPy/SciPy/Pandas, Pyramid, AngularJS, React

  • Familiarity with a variety of programming styles (e.g. OO, functional) and in-depth knowledge of design patterns.



Advantageous



  • An excellent understanding of financial markets and instruments

  • Experience of front office software and/or trading systems development e.g. in a hedge fund or investment bank

  • Expertise in building distributed systems with service-based or event-driven architectures, and concurrent processing

  • A knowledge of modern practices for data engineering and stream processing

  • An understanding of financial market data collection and processing

  • Experience of web based development and visualisation technology for portraying large and complex data sets and relationships

  • Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms.


Personal Attributes



  • Strong academic record and a degree with high mathematical and computing content e.g. Computer Science, Mathematics, Engineering or Physics from a leading university

  • Craftsman-like approach to building software; takes pride in engineering excellence and instils these values in others

  • Demonstrable passion for technology e.g. personal projects, open-source involvement

  • Intellectually robust with a keenly analytic approach to problem solving

  • Self-organised with the ability to effectively manage time across multiple projects and with competing business demands and priorities

  • Focused on delivering value to the business with relentless efforts to improve process

  • Strong interpersonal skills; able to establish and maintain a close working relationship with quantitative researchers, traders and senior business people alike

  • Confident communicator; able to argue a point concisely and deal positively with conflicting views.

118118Money
  • Austin, TX

Seeking an individual with a keen eye for good design combined with the ability to communicate those designs through informative design artifacts. Candidates should be familiar with an Agile development process (and understand its limitations), able to mediate between product / business needs and developer architectural needs. They should be ready to get their hands dirty coding complex pieces of the overall architecture.

We are .NET Core on the backend, Angular 2 on a mobile web front-end, and native on Android and iOS. We host our code across AWS and on-premises VMs, and use various data backends (SQL Server, Oracle, Mongo).

Very important is interest in (and hopefully, experience with) modern big data pipelines and machine learning. Experience with streaming platforms feeding Apache Spark jobs that train machine learning models would be music to our ears. Financial platforms generate massive amounts of data, and re-architecting aspects of our microservices to support that will be a key responsibility.
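For a sense of what that can look like in practice, here is a minimal sketch of a Spark Structured Streaming job that reads events from Kafka and lands them where a periodic model-training job could pick them up. The topic, broker address, schema and storage paths are assumptions for illustration, not our actual architecture (the job also needs the spark-sql-kafka connector on its classpath).

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("loan-event-stream").getOrCreate()

    schema = StructType([
        StructField("customer_id", StringType()),
        StructField("loan_amount", DoubleType()),
        StructField("default_flag", DoubleType()),
    ])

    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "loan-events")
              .load()
              .select(from_json(col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Persist micro-batches so a scheduled Spark ML training job can consume them
    query = (events.writeStream
             .format("parquet")
             .option("path", "s3a://example-bucket/loan-events/")
             .option("checkpointLocation", "s3a://example-bucket/checkpoints/loan-events/")
             .start())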

118118 Money is a private financial services company with R&D headquartered in Austin along highway 360, in front of the Bull Creek Nature preserve. We have offices around the world, so the candidate should be open to occasional travel abroad. The atmosphere is casual, and has a startup feel. You will see your software creations deployed quickly.

Responsibilities

    • Help us to build a big data pipeline and add machine learning capability to more areas of our platform.
    • Manage code from development through deployment, including support and maintenance.
    • Perform code reviews, assist and coach more junior developers to adhere to proper design patterns.
    • Build fault-tolerant distributed systems.

Requirements

    • Expertise in .NET, C#, HTML5, CSS3, Javascript
    • Experience with some flavor of ASP.NET MVC
    • Experience with SQL Server
    • Expertise in the design of elegant and intuitive REST APIs.
    • Cloud development experience (Amazon, Azure, etc)
    • Keen understanding of security principles as they pertain to service design.
    • Expertise in object-oriented design principles.

Desired

    • Machine Learning experience
    • Mobile development experience
    • Kafka / message streaming experience
    • Apache Spark experience
    • Knowledge of the ins and outs of Docker containers
    • Experience with MongoDB
FCA Fiat Chrysler Automobiles
  • Detroit, MI

Fiat Chrysler Automobiles is looking to fill the full-time position of a Data Scientist. This position is responsible for delivering insights to the commercial functions in which FCA operates.


The Data Scientist is a role in the Business Analytics & Data Services (BA) department and reports through the CIO. They will play a pivotal role in the planning, execution and delivery of data science and machine learning-based projects. The bulk of the work will be in the areas of data exploration and preparation, data collection and integration, machine learning (ML) and statistical modelling, and data pipelining and deployment.

The newly hired data scientist will be a key interface between the ICT Sales & Marketing team, the Business and the BA team. Candidates need to be very much self-driven, curious and creative.

Primary Responsibilities:

    • Problem Analysis and Project Management:
      • Guide and inspire the organization about the business potential and strategy of artificial intelligence (AI)/data science
      • Identify data-driven/ML business opportunities
      • Collaborate across the business to understand IT and business constraints
      • Prioritize, scope and manage data science projects and the corresponding key performance indicators (KPIs) for success
    • Data Exploration and Preparation:
      • Apply statistical analysis and visualization techniques to various data, such as hierarchical clustering, T-distributed Stochastic Neighbor Embedding (t-SNE), and principal components analysis (PCA); see the sketch after this list
      • Generate and test hypotheses about the underlying mechanics of the business process.
      • Network with domain experts to better understand the business mechanics that generated the data.
    • Data Collection and Integration:
      • Understand new data sources and process pipelines. Catalog and document their use in solving business problems.
      • Create data pipelines and assets that enable more efficiency and repeatability of data science activities.
    • Machine Learning and Statistical Modelling:
      • Apply various ML and advanced analytics techniques to perform classification or prediction tasks
      • Integrate domain knowledge into the ML solution; for example, from an understanding of financial risk, customer journey, quality prediction, sales, marketing
      • Testing of ML models, such as cross-validation, A/B testing, bias and fairness
    • Operationalization:
      • Collaborate with ML operations (MLOps), data engineers, and IT to evaluate and implement ML deployment options
      • (Help to) integrate model performance management tools into the current business infrastructure
      • (Help to) implement champion/challenger test (A/B tests) on production systems
      • Continuously monitor execution and health of production ML models
      • Establish best practices around ML production infrastructure
    • Other Responsibilities:
      • Train other business and IT staff on basic data science principles and techniques
      • Train peers on specialist data science topics
      • Promote collaboration with the data science COE within the organization.
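As a minimal sketch of the exploration techniques named above (synthetic data and arbitrary parameter choices, purely illustrative rather than FCA's actual workflow), scikit-learn covers PCA, t-SNE and hierarchical clustering directly:

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.manifold import TSNE
    from sklearn.cluster import AgglomerativeClustering

    # Hypothetical feature matrix: rows are customers, columns are behavioural metrics
    X = np.random.rand(500, 20)
    X_scaled = StandardScaler().fit_transform(X)

    # PCA to reduce noise and capture the dominant directions of variation
    pca = PCA(n_components=10)
    X_pca = pca.fit_transform(X_scaled)
    print("explained variance:", pca.explained_variance_ratio_.sum())

    # t-SNE for a 2-D view of local structure (tune perplexity to the data set size)
    X_embedded = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X_pca)

    # Hierarchical clustering on the reduced representation
    labels = AgglomerativeClustering(n_clusters=5).fit_predict(X_pca)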

Basic Qualifications:

    • A bachelor's degree in computer science, data science, operations research, statistics, applied mathematics, or a related quantitative field is required; alternate experience and education in equivalent areas such as economics, engineering or physics is acceptable. Experience in more than one area is strongly preferred.
    • Candidates should have three to six years of relevant project experience in successfully planning, launching and executing data science projects, preferably in the domains of automotive or customer behavior prediction.
    • Coding knowledge and experience in several languages: for example, R, Python, SQL, Java, C++, etc.
    • Experience of working across multiple deployment environments including cloud, on-premises and hybrid, multiple operating systems and through containerization techniques such as Docker, Kubernetes, AWS Elastic Container Service, and others.
    • Experience with distributed data/computing and database tools: MapReduce, Hadoop, Hive, Kafka, MySQL, Postgres, DB2 or Greenplum, etc.
    • All candidates must be self-driven, curious and creative.
    • They must demonstrate the ability to work in diverse, cross-functional teams.
    • Should be confident, energetic self-starters, with strong moderation and communication skills.

Preferred Qualifications:

    • A master's degree or PhD in statistics, ML, computer science or the natural sciences, especially physics or any engineering disciplines or equivalent.
    • Experience in one or more of the following commercial/open-source data discovery/analysis platforms: RStudio, Spark, KNIME, RapidMiner, Alteryx, Dataiku, H2O, SAS Enterprise Miner (SAS EM) and/or SAS Visual Data Mining and Machine Learning, Microsoft AzureML, IBM Watson Studio or SPSS Modeler, Amazon SageMaker, Google Cloud ML, SAP Predictive Analytics.
    • Knowledge and experience in statistical and data mining techniques: generalized linear model (GLM)/regression, random forest, boosting, trees, text mining, hierarchical clustering, deep learning, convolutional neural network (CNN), recurrent neural network (RNN), T-distributed Stochastic Neighbor Embedding (t-SNE), graph analysis, etc.
    • A specialization in text analytics, image recognition, graph analysis or other specialized ML techniques such as deep learning, etc., is preferred.
    • Ideally, the candidates are adept in agile methodologies and well-versed in applying DevOps/MLOps methods to the construction of ML and data science pipelines.
    • Knowledge of industry standard BA tools, including Cognos, QlikView, Business Objects, and other tools that could be used for enterprise solutions
    • Should exhibit superior presentation skills, including storytelling and other techniques to guide and inspire and explain analytics capabilities and techniques to the organization.
Pyramid Consulting, Inc
  • Atlanta, GA

Job Title: Tableau Engineer

Duration: 6-12 Months+ (potential to go perm)

Location: Atlanta, GA (30328) - Onsite

Notes from Manager:

We need a data analyst who knows Tableau, scripting (JSON, Python), the Alteryx API, AWS, and analytics.

Description

The Tableau Software engineer will be a key resource working across our Software Engineering BI/Analytics stack to ensure stability, scalability, and the delivery of valuable BI & Analytics solutions for our leadership teams and business partners. Keys to this position are the ability to excel in identifying problems or analytic gaps and in mapping and implementing pragmatic solutions. An excellent blend of analytical, technical and communication skills in a team-based environment is essential for this role.

Tools we use: Tableau, Business Objects, AngularJS, OBIEE, Cognos, AWS, Opinion Lab, JavaScript, Python, Jaspersoft, Alteryx and R packages, Spark, Kafka, Scala, Oracle

Your Role:

·         Able to design, build, maintain & deploy complex reports in Tableau

·         Experience integrating Tableau into another application or native platforms is a plus

·         Expertise in Data Visualization including effective communication, appropriate chart types, and best practices.

·         Knowledge of best practices and experience optimizing Tableau for performance.

·         Experience reverse engineering and revising Tableau Workbooks created by other developers.

·         Understand basic statistical routines (mean, percentiles, significance, correlations) with ability to apply in data analysis

·         Able to turn ideas into creative & statistically sound decision support solutions

Education and Experience:

·         Bachelor's degree in Computer Science or equivalent work experience

·         3-5 Years of hands on experience in data warehousing & BI technologies (Tableau/OBIEE/Business Objects/Cognos)

·         Three or more years of experience in developing reports in Tableau

·         Have good understanding of Tableau architecture, design, development and end user experience.

What We Look For:

·         Very proficient in working with large databases in Oracle; Big Data technologies will be a plus.

·         Deep understanding & working experience of data warehouse and data mart concepts.

·         Understanding of Alteryx and R packages is a plus

·         Experience designing and implementing high volume data processing pipelines, using tools such as Spark and Kafka.

·         Experience with Scala, Java or Python and a working knowledge of AWS technologies such as GLUE, EMR, Kinesis and Redshift preferred.

·         Excellent knowledge with Amazon AWS technologies, with a focus on highly scalable cloud-native architectural patterns, especially EMR, Kinesis, and Redshift

·         Experience with software development tools and build systems such as Jenkins

The HT Group
  • Austin, TX

Full Stack Engineer, Java/Scala Direct Hire Austin

Do you have a track record of building both internal- and external-facing software services in a dynamic environment? Are you passionate about introducing disruptive and innovative software solutions for the shipping and logistics industry? Are you ready to deliver immediate impact with the software you create?

We are looking for Full Stack Engineers to craft, implement and deploy new features, services, platforms, and products. If you are curious, driven, and naturally explore how to build elegant and creative solutions to complex technical challenges, this may be the right fit for you. If you value a sense of community and shared commitment, you'll collaborate closely with others in a full-stack role to ship software that delivers immediate and continuous business value. Are you up for the challenge?

Tech Tools:

  • Application stack runs entirely on Docker, frontend and backend
  • Infrastructure is 100% Amazon Web Services and we use AWS services whenever possible. Current examples: EC2 Elastic Container Service (Docker), Kinesis, SQS, Lambda and Redshift
  • Java and Scala are the languages of choice for long-lived backend services
  • Python for tooling and data science
  • Postgres is the SQL database of choice
  • Actively migrating to a modern JavaScript-centric frontend built on Node, React/Relay, and GraphQL as some of our core UI technologies

Responsibilities:

  • Build both internal and external REST/JSON services running on our 100% Docker-based application stack or within AWS Lambda
  • Build data pipelines around event-based and streaming-based AWS services and application features (see the sketch after this list)
  • Write deployment, monitoring, and internal tooling to operate our software with as much efficiency as we build it
  • Share ownership of all facets of software delivery, including development, operations, and test
  • Mentor junior members of the team and coach them to be even better at what they do
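As a minimal sketch of one such pipeline stage (the event shape and downstream sink are assumptions, not a prescribed design), an AWS Lambda handler consuming a Kinesis stream might look like this in Python:

    import base64
    import json

    def handler(event, context):
        # Minimal AWS Lambda handler for a Kinesis-triggered pipeline step.
        # Assumes records carry JSON payloads; the downstream sink is left abstract.
        for record in event.get("Records", []):
            payload = base64.b64decode(record["kinesis"]["data"])
            message = json.loads(payload)
            # ... transform `message` and forward it to the next stage (e.g. SQS, Redshift)
            print(message)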

Requirements:

  • Embrace the AWS + DevOps philosophy and believe this is an innovative approach to creating and deploying products and technical solutions that require software engineers to be truly full-stack
  • Have high-quality standards, pay attention to details, and love writing beautiful, well-designed and tested code that can stand the test of time
  • Have built high-quality software, solved technical problems at scale and believe in shipping software iteratively and often
  • Proficient in and have delivered software in Java, Scala, and possibly other JVM languages
  • Developed a strong command of Computer Science fundamentals
Riccione Resources
  • Dallas, TX

Sr. Data Engineer Hadoop, Spark, Data Pipelines, Growing Company

One of our clients is looking for a Sr. Data Engineer in the Fort Worth, TX area! Build your data expertise with projects centering on large Data Warehouses and new data models! Think outside the box to solve challenging problems! Thrive in the variety of technologies you will use in this role!

Why should I apply here?

    • Culture built on creativity and respect for engineering expertise
    • Nominated as one of the Best Places to Work in DFW
    • Entrepreneurial environment, growing portfolio and revenue stream
    • One of the fastest growing mid-size tech companies in DFW
    • Executive management with past successes in building firms
    • Leader of its technology niche, setting the standards
    • A robust, fast-paced work environment
    • Great technical challenges for top-notch engineers
    • Potential for career growth, emphasis on work/life balance
    • A remodeled office with a bistro, lounge, and foosball

What will I be doing?

    • Building data expertise and owning data quality for the transfer pipelines that you create to transform and move data to the company's large Data Warehouse
    • Architecting, constructing, and launching new data models that provide intuitive analytics to customers
    • Designing and developing new systems and tools to enable clients to optimize and track advertising campaigns
    • Using your expert skills across a number of platforms and tools such as Ruby, SQL, Linux shell scripting, Git, and Chef
    • Working across multiple teams in high visibility roles and owning the solution end-to-end
    • Providing support for existing production systems
    • Broadly influencing the company's clients and internal analysts

What skills/experiences do I need?

    • B.S. or M.S. degree in Computer Science or a related technical field
    • 5+ years of experience working with Hadoop and Spark
    • 5+ years of experience with Python or Ruby development
    • 5+ years of experience with efficient SQL (Postgres, Vertica, Oracle, etc.)
    • 5+ years of experience building and supporting applications on Linux-based systems
    • Background in engineering Spark data pipelines
    • Understanding of distributed systems

What will make my résumé stand out?

    • Ability to customize an ETL or ELT
    • Experience building an actual data warehouse schema

Location: Fort Worth, TX

Citizenship: U.S. citizens and those authorized to work in the U.S. are encouraged to apply. This company is currently unable to provide sponsorship (e.g., H1B).

Salary: $115 - $130k + 401k Match

---------------------------------------------------


~SW1317~

Gravity IT Resources
  • Miami, FL

Overview of Position:

We are undertaking an ambitious digital transformation across Sales, Service, Marketing, and eCommerce. We are looking for a web data analytics wizard with prior experience in digital data preparation, discovery, and predictive analytics.

The data scientist/web analyst will work with external partners, digital business partners, enterprise analytics, and the technology team to strategically plan and develop datasets, measure web analytics, and execute on predictive and prescriptive use cases. The role demands the ability to (1) learn quickly, (2) work in a fast-paced, team-driven environment, (3) manage multiple efforts simultaneously, (4) use large datasets and models to test the effectiveness of different courses of action, (5) promote data-driven decision making throughout the organization, and (6) define and measure the success of the capabilities we provide the organization.


Primary Duties and Responsibilities

    • Analyze data captured through Google Analytics and develop meaningful actionable insights on digital behavior.
    • Put together a customer 360 data frame by connecting CRM Sales, Service, Marketing cloud data with Commerce Web behavior data and wrangle the data into a usable form (see the sketch after this list).
    • Use predictive modelling to increase and optimize customer experiences across online & offline channels.
    • Evaluate customer experience and conversions to provide insights & tactical recommendations for web optimization.
    • Execute on digital predictive use cases and collaborate with the enterprise analytics team to ensure use of the best tools and methodologies.
    • Lead support for enterprise voice-of-customer feedback analytics.
    • Enhance and maintain the digital data library and definitions.
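The customer 360 assembly described above could look roughly like the following. This is a minimal sketch in Python/pandas with hypothetical file and column names; the posting's R-based workflow would follow the same join-and-wrangle shape.

    import pandas as pd

    # Hypothetical extracts: CRM/Sales/Service/Marketing cloud data and web behaviour data,
    # both keyed by a shared customer identifier
    crm = pd.read_csv("crm_customers.csv")        # customer_id, segment, lifetime_value, ...
    web = pd.read_csv("ga_web_behaviour.csv")     # customer_id, sessions, conversions, ...

    # Left-join web behaviour onto the CRM base to form a single customer 360 frame
    customer_360 = crm.merge(web, on="customer_id", how="left")

    # Basic wrangling: fill missing behaviour with zeros and derive a conversion rate
    customer_360[["sessions", "conversions"]] = customer_360[["sessions", "conversions"]].fillna(0)
    customer_360["conversion_rate"] = (
        customer_360["conversions"] / customer_360["sessions"].where(customer_360["sessions"] > 0)
    )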

Minimum Qualifications

  • Bachelor's degree in Statistics, Computer Science, Marketing, Engineering or equivalent
  • 3 years or more of working experience in building predictive models.
  • Experience in Google Analytics or similar web behavior tracking tools is required.
  • Experience in R is a must, with working knowledge of connecting to multiple data sources such as Amazon Redshift, Salesforce, Google Analytics, etc.
  • Working knowledge in machine learning algorithms such as Random Forest, K-means, Apriori, Support Vector machine, etc.
  • Experience in A/B testing or multivariate testing.
  • Experience in media tracking tags and pixels, UTM, and custom tracking methods.
  • Microsoft Office Excel & PPT (advanced).

Preferred Qualifications

  • Master's degree in statistics or equivalent.
  • Google Analytics 360 experience/certification.
  • SQL workbench, Postgres.
  • Alteryx experience is a plus.
  • Tableau experience is a plus.
  • Experience in HTML, JavaScript.
  • Experience in SAP analytics cloud or SAP desktop predictive tool is a plus
Signify Health
  • Dallas, TX

Position Overview:

Signify Health is looking for a savvy Data Engineer to join our growing team of deep learning specialists. This position is responsible for evolving and optimizing data and data pipeline architectures, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will support software developers, database architects, data analysts, and data scientists. The ideal candidate is self-directed, passionate about optimizing data, and comfortable supporting the data wrangling needs of multiple teams, systems and products.

If you enjoy providing expert-level IT technical services, including the direction, evaluation, selection, configuration, implementation, and integration of new and existing technologies and tools, while working closely with IT team members, data scientists, and data engineers to build our next generation of AI-driven solutions, we will give you the opportunity to grow personally and professionally in a dynamic environment. Our projects are built on cooperation and teamwork, and you will find yourself working together with other talented, passionate and dedicated team members, all working towards a shared goal.

Essential Job Responsibilities:

  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing data models for greater scalability, etc.
  • Leverage Azure for extraction, transformation, and loading of data from a wide variety of data sources in support of AI/ML Initiatives
  • Design and implement high performance data pipelines for distributed systems and data analytics for deep learning teams
  • Create tool-chains for analytics and data scientist team members that assist them in building and optimizing AI workflows
  • Work with data and machine learning experts to strive for greater functionality in our data and model life cycle management capabilities
  • Communicate results and ideas to key decision makers in a concise manner
  • Comply with applicable legal requirements, standards, policies and procedures including, but not limited to the Compliance requirements and HIPAA.


Qualifications:Education/Licensing Requirements:
  • High school diploma or equivalent.
  • Bachelor's degree in Computer Science, Electrical Engineering, Statistics, Informatics, Information Systems, or another related quantitative field, or equivalent work experience.


Experience Requirements:
  • 5+ years of experience in a Data Engineer role.
  • Experience using the following software/tools preferred:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with AWS or Azure cloud services.
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C#, etc.
  • Strong work ethic, able to work both collaboratively, and independently without a lot of direct supervision, and solid problem-solving skills
  • Must have strong communication skills (written and verbal), and possess good one-on-one interpersonal skills.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable big data data stores.
  • 2 years of experience in data modeling, ETL development, and Data warehousing
 

Essential Skills:

  • Fluently speak, read, and write English
  • Fantastic motivator and leader of teams with a demonstrated track record of mentoring and developing staff members
  • Strong point of view on who to hire and why
  • Passion for solving complex system and data challenges and desire to thrive in a constantly innovating and changing environment
  • Excellent interpersonal skills, including teamwork and negotiation
  • Excellent leadership skills
  • Superior analytical abilities, problem solving skills, technical judgment, risk assessment abilities and negotiation skills
  • Proven ability to prioritize and multi-task
  • Advanced skills in MS Office

Essential Values:

  • In Leadership: Do what's right, even if it's tough
  • In Collaboration: Leverage our collective genius, be a team
  • In Transparency: Be real
  • In Accountability: Recognize that if it is to be, it's up to me
  • In Passion: Show commitment in heart and mind
  • In Advocacy: Earn trust and business
  • In Quality: Ensure what we do, we do well
Working Conditions:
  • Fast-paced environment
  • Requires working at a desk and use of a telephone and computer
  • Normal sight and hearing ability
  • Use office equipment and machinery effectively
  • Ability to ambulate to various parts of the building
  • Ability to bend, stoop
  • Work effectively with frequent interruptions
  • May require occasional overtime to meet project deadlines
  • Lifting requirements of
HelloFresh US
  • New York, NY

HelloFresh is hiring a Data Scientist to join our Supply Chain Analytics Team! In this exciting role, you will develop cutting edge insights using a wealth of data about our suppliers, ingredients, operations, and customers to improve the customer experience, drive operational efficiencies and build new supply chain capabilities. To succeed in this role, you’ll need to have a genuine interest in using data and analytic techniques to solve real business challenges, and a keen interest to make a big impact on a fast-growing organization.


You will...



  • Own the development and deployment of quantitative models to make routine and strategic operational decisions to plan the fulfillment of orders and identify the supply chain capabilities we need to build to continue succeeding in the business

  • Solve complex optimization problems with linear programming techniques (a minimal sketch follows this list)

  • Collaborate across operational functions (e.g. supply chain planning, logistics, procurement, production, etc) to identify and prioritize projects

  • Communicate results and recommendations to stakeholders in a business oriented manner with clear guidelines which can be implemented across functions in the supply chain

  • Work with complex datasets across various platforms to perform descriptive, prescriptive, predictive, and exploratory analyses
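To make the linear programming item above concrete, here is a minimal sketch of the kind of model involved: a made-up two-center, three-region fulfillment problem solved with scipy's linprog. Real models would be far larger and would typically run on a dedicated solver such as CPLEX.

    import numpy as np
    from scipy.optimize import linprog

    # Decision variables: boxes shipped from 2 fulfillment centers to 3 regions (6 variables)
    cost = np.array([4.0, 6.0, 9.0,   # center A -> regions 1-3
                     7.0, 5.0, 3.0])  # center B -> regions 1-3

    # Capacity constraints (<=): each center ships at most its weekly capacity
    A_ub = [[1, 1, 1, 0, 0, 0],
            [0, 0, 0, 1, 1, 1]]
    b_ub = [50_000, 40_000]

    # Demand constraints (==): each region receives exactly its demand
    A_eq = [[1, 0, 0, 1, 0, 0],
            [0, 1, 0, 0, 1, 0],
            [0, 0, 1, 0, 0, 1]]
    b_eq = [30_000, 25_000, 20_000]

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    print(res.x.reshape(2, 3), res.fun)  # optimal shipment plan and total cost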


At a minimum, you have...



  • Advanced degree in Statistics, Economics, Applied Mathematics, Computer Science, Data Science, Engineering or a related field

  • 2 - 5 years’ experience delivering analytical solutions to complex business problems

  • Knowledge of linear programming optimization techniques (familiarity with software like CPLEX, AMPL, etc is a plus)

  • Fluency in managing and analyzing large data sets with advanced tools, such as R, Python, etc.

  • Experience extracting and transforming data from structured databases such as: MySQL, PostgreSQL, etc.


You are...



  • Results-oriented - You love transforming data into meaningful outcomes

  • Gritty - When you encounter obstacles you find solutions, not excuses

  • Intellectually curious – You love to understand why things are the way they are, how things work, and challenge the status quo

  • A team player – You favor team victories over individual success

  • A structured problem solver – You possess strong organizational skills and consistently demonstrate a methodical approach to all your work

  • Agile – You thrive in fast-paced and dynamic environments and are comfortable working autonomously

  • A critical thinker – You use logic to identify opportunities, evaluate alternatives, and synthesize and present critical information to solve complex problems



Our team is diverse, high-performing and international, helping us to create a truly inspiring work environment in which you will thrive!


It is the policy of HelloFresh not to discriminate against any employee or applicant for employment because of race, color, religion, sex, sexual orientation, gender identity, national origin, age, marital status, genetic information, disability or because he or she is a protected veteran.

Manhattan Associates, Inc.
  • Atlanta, GA
Manhattan designs, builds and delivers market-leading Supply Chain Commerce solutions for its customers around the world. We help drive the commerce revolution with unmatched insight and unrivaled technology, connecting front-end revenue and relationships with back-end execution and efficiency, optimized on a common technology platform. This platform-based approach is enabling leading companies across the globe to get closer to their customers and achieve real-world results.
What Drives Us
We have 3 core philosophies that make up what we call the Manhattan Spirit. These philosophies call on each of us to Focus on the Customer, Seize Every Opportunity, and Never Settle. With these concepts as guides, we have become the most sought-after commerce supply chain solution for companies all over the globe. Our teams are diverse, intelligent, collaborative, and fun! At Manhattan Associates, your role would be unique and important, as all our employees are essential to building success with our clients and our reputation as a gold standard in Supply Chain solutions. Named consistently as a Leader in the Gartner Magic Quadrant, Manhattan Associates is further expanding its leadership capabilities by investing in easy-to-deploy, extensible and highly scalable cloud-based solutions.
Where You Will Work
Our R&D team is the heart and soul of Manhattan Associates' product portfolio. They design the future of our products, staying ahead of the curve both technically and operationally over our competitors. You will be floating in the cloud, drinking your fair share of Java, and getting creative daily. You will have the opportunity to learn and interact with people from a variety of backgrounds and skill sets to enhance your technical knowledge while on a path for career growth internally at Manhattan Associates.
What's the Big Picture?
Some of the important things you will do in this role:
  • Provide technical leadership to a small group of software engineers.
  • Develop detailed design specifications for multiple areas of a software system (or database); responsible for the design and implementation of complex frameworks and toolkits to be used across multiple products (SCPP/CBO).
  • Become familiar with all dependencies, interfaces and services required by most areas of the software system.
  • Determine optimal and efficient designs for multiple areas of the software system.
  • Lead development of multiple areas of the software system from detailed design specifications.
  • Develop and unit test assigned software areas following R&D development processes.
  • Estimate and plan own work and the work of others in the group.
  • Become the resident expert across multiple areas of the system.
  • Evaluate software (or configuration) issues with many areas of the system and resolves them in a timely manner.
  • Implement changes to system assuring the changes do not introduce new issues.
  • Keep abreast of improvements in software techniques and develop some improvements on your own.
  • Facilitate technology and skills knowledge transfer within the team and beyond.
  • Document all work.
  • Ensure high quality software deliverables by leveraging automation and tooling best practices.
What's Required of Me?
    • Bachelor's or foreign equivalent degree in computer science, engineering or a related technical field
    • 7 years of experience developing, supporting, or implementing application software
    • 5 years of experience with JSF technology
    • 5 years of experience developing UIs with HTML, JavaScript, and CSS
    • 3 years of experience developing with Spring framework
    • 5 years of experience developing enterprise-scale frameworks in Java-based UI technologies
    • 3 years of experience developing with Angular, Ionic, or other JS frameworks
What Would Make Me Stand Out?
  • Strong Angular 4+ skills, utilizing CLI
  • Experience in Ionic 3 or better
  • Experience in mobile web application development and hybrid apps using Cordova or similar
Manhattan Associates is at the forefront of the most innovative technologies, in a business casual environment that provides plenty of opportunity for growth. We pride ourselves on promoting a culture that encourages open minds, fosters superior communication and creates new possibilities. Manhattan Associates is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a veteran. In the United States, Manhattan Associates participates in the Employment Eligibility Verification Program (E-Verify) operated by the Department of Homeland Security in partnership with the Social Security Administration. Participation in the E-Verify Program allows Manhattan Associates to confirm the employment eligibility of all newly hired employees after the Employment Eligibility Verification Form (Form I-9) has been completed.
Hulu
  • Santa Monica, CA

WHAT YOU’LL DO



  • Build elegant systems that are robust and scalable

  • Challenge our team and software to be even better

  • Use a mix of technologies including Scala, Ruby, Python, and Angular JS


WHAT TO BRING



  • BS or MS in Computer Science/Engineering

  • 5+ years of relevant software engineering experience

  • Strong programming (Java/C#/C++ or other related programming languages) and scripting skills

  • Great communication, collaboration skills and a strong teamwork ethic

  • Strive for excellence


NICE-TO-HAVES



  • Experience with both statically typed languages and dynamic languages

  • Experience with relational (Oracle, MySQL) and non-relational database technologies (MongoDB, Cassandra, DynamoDB)

Computer Staff
  • Fort Worth, TX

We have been retained by our client located in Fort Worth, Texas (south Ft Worth area), to deliver a Risk Modeler on a regular full-time basis. We prefer SAS experience but are interviewing candidates with R, SPSS, WPS, MATLAB or similar statistical package experience if the candidate has experience from the financial loan credit risk analysis industry. Enjoy all the resources of a big company with none of the problems that small companies have; this company has doubled in size in 3 years. We have a keen interest in finding a business-minded statistical modeling candidate with some credit risk experience to build statistical models within the marketing and direct mail areas of financial services, lending and loans. We are seeking a candidate with statistical modeling and data analysis skills, interested in creating better ways to solve problems in order to increase loan originations, decrease loan defaults, and more. Our client is in business to find prospective borrowers, originate loans, provide loans, service loans, process loans and collect loan payments. The team works with third-party data vendors, credit reporting agencies and data service providers on data augmentation, address standardization, fraud detection, decision sciences and analytics, and this position includes the creation of statistical models. They support one of the largest, if not the largest, decision management profiles in the US.


We require experience with statistical analysis tools such as SAS, MATLAB, R, WPS, SPSS, or Python if used for statistical analysis. This is a statistical modeling, risk modeling, model building, decision science, data analysis and statistical analysis type of role requiring SQL and/or SQL Server experience and critical thinking skills to solve problems. We prefer candidates with experience in data analysis, SQL queries, joins (left, inner, outer, right), and reporting from data warehouses with tools such as Tableau, Cognos, Looker, or Business Objects. We prefer candidates with financial and loan experience, especially knowledge of loan originations, borrower profiles or demographics, modeling loan defaults, and statistical analysis such as Gini coefficients and the K-S (Kolmogorov-Smirnov) test for credit scoring and default prediction and modeling.
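To illustrate the evaluation measures mentioned above, here is a minimal sketch on synthetic data. The real modelling here is done in SAS or a similar package; this Python version only shows how the K-S statistic and Gini coefficient relate to a fitted credit-scoring model.

    import numpy as np
    from scipy.stats import ks_2samp
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Synthetic loan data: X holds borrower features, y flags defaults (1 = default)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 10))
    y = (X[:, 0] + rng.normal(size=5000) > 1.0).astype(int)

    # Fit a simple scoring model (logistic regression) and score the portfolio
    model = LogisticRegression(max_iter=1000).fit(X, y)
    scores = model.predict_proba(X)[:, 1]

    # K-S statistic: maximum separation between the score distributions of
    # defaulters and non-defaulters, a standard rank-ordering measure for credit models
    ks_stat, _ = ks_2samp(scores[y == 1], scores[y == 0])

    # Gini coefficient derived from AUC: Gini = 2 * AUC - 1
    gini = 2 * roc_auc_score(y, scores) - 1
    print(f"KS = {ks_stat:.3f}, Gini = {gini:.3f}")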


However, critical thinking skills and statistical modeling and math/statistics skills are primarily what is needed to fulfill the tasks of this very interesting and important role, which includes growing your skills within this small risk/modeling team. Take on challenges in the creation and use of statistical models. There is no use for Hadoop or any NoSQL databases in this position; this is not a big data type of position, and no machine learning or artificial intelligence is needed in this role. Your role is to create and use statistical models: create statistical models for direct mail in the financial lending space to reach the right customers with the right profiles/demographics/credit ratings, etc. Take credit risk, credit analysis and loan data and build a new model, validate the existing model, recalibrate it, or rebuild it completely. The models are focused on delivering answers to questions or solutions to problems within these areas of financial loan lending: risk analysis, credit analysis, direct marketing, direct mail, and defaults. Expect logistic regression in SAS or Knowledge Studio, and some light use of Looker as the B.I. tool on top of SQL Server data. Deliver solutions or ways for this business to make improvements in these areas and help the business be more profitable. Seek answers to questions. Seek solutions to problems. Create models. Dig into the data. Explore and find opportunities to improve the business. You are expected to work within the boundaries of defaults and loan values and help drive the business with ideas to get better models in place, or explore data sources to get better models in place. Use critical thinking to solve problems.


Answer questions or solve problems such as:

What are the statistical models needed to produce the answers to solve risk analysis and credit analysis problems?

Which customer profiles have the best demographics or credit risk for loans, to send direct mail items to as direct marketing pieces?

Why are loan defaults increasing or decreasing? What is impacting the increase or decrease of loan defaults?  



Required Skills

Bachelor's degree in Statistics, Finance, Economics, Management Information Systems, Math, Quantitative Business Analysis, Analytics, or any other related math, science or finance degree. Some loan/lending business domain work experience.

Master's degree preferred, but not required.

Critical thinking skills.

Must have SQL skills (any database: SQL Server, MS Access, Oracle, PostgreSQL) and the ability to write queries and joins (inner, left, right, outer). SQL Server is highly preferred.

Experience with any statistical analysis systems/packages, including statistical modeling experience, and excellent math skills: SAS, MATLAB, R, WPS, SPSS, or Python/R if used for statistical analysis. Must have significant statistical modeling skills and experience.



Preferred Skills:
Loan credit analysis highly preferred. SAS highly preferred.
Experience with Tableau, Cognos, Business Objects, Looker or similar data warehouse reporting tools; creating reports from data warehouse data. SQL Server SSAS, but only to pull reports. Direct marketing, direct mail marketing, and loan/lending to somewhat higher-risk borrowers.



Employment Type:   Regular Full-Time

Salary Range: $85,000 - $130,000 / year

Benefits: health, medical, dental, and vision coverage costs the employee only about $100 per month.
401k 4% matching after 1 year, Bonus structure, paid vacation, paid holidays, paid sick days.

Relocation assistance is an option that can be provided, for a very well qualified candidate. Local candidates are preferred.

Location: Fort Worth, Texas
(area south of downtown Fort Worth, Texas)

Immigration: US citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor H-1B candidates at this time.

Please apply with your resume (MS Word format preferred), or apply with your LinkedIn profile, via the buttons at the bottom of this job posting page:

http://www.computerstaff.com/?jobIdDescription=314  


Please call 817-424-1411 or send a text to 817-601-7238 to inquire or to follow up on your application; we recommend you at least call to leave a message or send a text with your name. Thank you for your attention and efforts.

Apporchid Inc
  • Philadelphia, PA

Java Technical Lead

Job description:

We are looking for an experienced Java/J2EE technical lead with proven expertise in implementing and managing enterprise-scale Hadoop architectures and environments. You will set up a highly available App Orchid Java product platform in AWS with industry-standard security frameworks, and collaborate with application developers to support, manage, and enhance tactical roadmaps for large, highly visible product environment deployments.

Roles and Responsibilities:

  • Work with Solution Architects and Business leaders to understand the architectural roadmaps that support and fulfill business strategy.
  • Lead and design custom solutions on our App Orchid Product Platform
  • Act as a Tech Lead and Engineer mentoring colleagues with less experience
  • Collaborate with a high-performing, forward-focused team, Product Owner(s), and business stakeholders
  • Enable and influence the timely and successful delivery of business data capabilities and/or technology objectives
  • Opportunity to expand your communication, analytical, interpersonal, and organization capabilities
  • Experience working in a fast paced environment driving business outcomes leveraging Agile to its fullest
  • Enhance your entrepreneurial mindset, grow your network, and influence outcomes
  • A supportive environment that fosters a can-do attitude and offers opportunity for growth and advancement based on consistently demonstrated performance.
  • Apply expertise in system administration and programming: storage, performance tuning, and capacity management of big data.
  • Good understanding of the Hadoop ecosystem, including HDFS, YARN, MapReduce, HBase, Spark, and Hive.
  • Experience setting up SSL and integrating with Active Directory.
  • Good exposure to CI/CD
  • Oversee technical deliverables for investment and maintenance projects through the software development life cycle, including validating the completeness of estimates and the quality and accuracy of technical designs, builds, and implementations.
  • Proactively address technical issues and risks that could impact project schedule and/or budget
  • Work closely with stakeholders to design and document automation solutions that align with the business needs and also consistent with the architectural vision.
  • Facilitate continuity between Sourcing Partners, other IT Groups and Enterprise Architecture.
  • Work closely with the architecture team to ensure that technical solution designs and implementations are consistent with the architectural vision, and to drive the business forward through technical innovation using newly identified and leading technologies.
  • Own and drive adoption of DevOps tools and best practices (including conducting (automated) code reviews, reducing/eliminating technical debt, and delivering vulnerability free code) across the application portfolio.

Qualifications

  • Bachelor's degree or equivalent work experience
  • Eight to ten years (or more) of experience as a Java/J2EE technical lead or senior developer in a large production environment.
  • A deep understanding of big data, Java, Elasticsearch, Kibana, PostgreSQL, TestNG, and Gradle
  • Good verbal and written communication skills
  • Demonstrated experience in working on large projects or small teams
  • Working knowledge of Red Hat Linux and Windows operating systems
  • Expert knowledge of the Java programming language, SQL, and microservices
  • Good understanding of Cloud technologies, especially the AWS stack
  • At least 8 years of experience developing and implementing applications

Desired Skills and Experience

  • Proficient with Java development
  • Ability to quickly learn new technologies and enable/train other analysts
  • Ability to work independently as well as in a team environment on moderate to highly complex issues
  • High technical aptitude and demonstrated progression of technical skills - continuous improvement
  • Ability to automate software/application installations and configurations hosted on Linux servers.
Delivery Hero SE
  • Berlin, Germany

Delivery Hero is building the next generation global online food-delivery platform. 


We’re a truly global team, working across 45 countries to ensure our customers are able to find, order and receive their favourite food in the fastest way possible. Since we started our journey in 2011, Delivery Hero has become the world’s largest food-delivery network, and we’re focused on a culture of growth, in both size and opportunities.  


Food delivery… sounds fancy, but how does that actually work? 


Someone is hungry, she Googles her favourite restaurant, clicks the first link… and she lands on the restaurant page on lieferheld.de. 


Now imagine this person lives in Singapore, or in Argentina, or in one of the 40 countries where Delivery Hero is present. Each country has a different culture, a different language (or even several!). Imagine more than 200k restaurants with details that make them unique. 


How would you make sure that she finds the food she wants? Can you make sure it happens quickly? Can you make sure it happens at scale and in a sustainable way?


What's on the menu?



  • To design, implement and maintain new tools for the Advertising Solutions tech team.

  • Driving innovation in marketing, with a goal to increase the effectiveness of our marketing activities, through technological solutions.

  • Improving and maintaining existing applications and services.

  • You will be an active partner for Product management in order to provide the best value for stakeholders and our customers.

  • Ensuring code quality by coaching junior colleagues.

  • Taking responsibility for a number of systems/products.

  • You will make sure that the platforms keep up with high standards of scalability, stability, security and performance.

  • Our Tech Stack: python3, flask, drf, celery, pandas, docker, angular, circleci, aws/gcp.
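Purely as a hedged illustration of that stack (python3 + flask + pandas), here is a minimal sketch of the sort of reporting endpoint such a team might build; the route, columns, and ad_spend.csv source are invented for this example and are not Delivery Hero's actual code.

```python
# Minimal, hypothetical sketch only: a tiny campaign-reporting endpoint.
from flask import Flask, jsonify
import pandas as pd

app = Flask(__name__)

@app.route("/campaigns/<int:campaign_id>/performance")
def campaign_performance(campaign_id: int):
    # In a real service this would read from a database or warehouse, not a flat file.
    spend = pd.read_csv("ad_spend.csv")   # assumed columns: campaign_id, cost, orders
    rows = spend[spend["campaign_id"] == campaign_id]
    if rows.empty:
        return jsonify({"error": "unknown campaign"}), 404
    cost, orders = float(rows["cost"].sum()), int(rows["orders"].sum())
    return jsonify({
        "campaign_id": campaign_id,
        "cost": round(cost, 2),
        "orders": orders,
        "cost_per_order": round(cost / orders, 2) if orders else None,
    })

if __name__ == "__main__":
    app.run(debug=True)
```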


What do you bring to the table?



  • Software development experience with a key focus on modern web applications and cloud services.

  • You like data and you are comfortable handling lots of it.

  • Passion for clean and beautiful code, with an eye for simplicity and pragmatism.

  • You firmly believe in lean and agile development.

  • Getting up to speed with new systems and concepts quickly.

  • Understanding that quality is not ensured solely by the QA engineer, but by the team.

  • You believe in cross-functional teams and know that responsibility does not end with deployment.

  • Good communication skills and you enjoy sharing knowledge among your peers.

  • Nice to have: knowledge of the Facebook and AdWords APIs, and Python data manipulation libraries. Frontend development, infrastructure, and/or automation knowledge are a plus.


What do we feed you with?



  • English is our working language, but our colleagues at Delivery Hero come from every corner of the globe providing an incredibly diverse, international working atmosphere with cross-cultural teams.

  • A modern, recently refurbished office in the heart of Berlin.

  • We offer flexible working hours to fit around your personal or family life in case you have to drop off your kids at kita or perhaps you just want to come in to work a little later.

  • Great career opportunities following our development career plan.

  • Being part of a global family under the Delivery Hero umbrella, we can offer you the safety of a large company including a pension scheme and stability.

  • Moving can be stressful so to help you settle in we provide a relocation package including visa help, temporary accommodation and a budget to applicants from abroad.

  • Our building has several kitchens where you’ll find fresh fruit, cereals, juice/ drinks, tea, coffee, etc.

  • On nearly every floor you’ll find a kicker or table tennis (or a lounge, and even a nap room) for when you need to take a break.

  • Being a food-ordering company, in between playing and working hard, we provide a generous number of monthly vouchers to use for ordering food on our platforms when you get the munchies.

  • Sprichst du kein Deutsch? No worries, we provide German classes for those expats who want to expedite their integration.

  • We know that experience at the office with interesting and diverse problems, colleagues and technologies isn’t the only way to learn, so our employees get an education budget to attend conferences or trainings locally or around Europe to satisfy their pursuit of knowledge.

  • Every Friday we have a team integration, when we just stop working earlier, grab a beer and relax with colleagues.

  • Sometimes we just enjoy our time together with team events and company events.


At Delivery Hero, we believe diversity and representation is key to creating not only an exciting product, but also an amazing customer and employee experience. Fostering this starts with hiring -- therefore we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, or any other aspect that makes you, you.

Have we caught your attention?
 Then please send us your application including cover letter, CV, salary expectations and earliest starting date. We’re looking forward to your application!

GE Capital
  • Ann Arbor, MI
  • ***Please Note: This role is in Van Buren, MI (a 30-minute drive from Ann Arbor)


Role Summary

Serve as analytics & visualization developer to build innovative solutions to support a broad range of analysis and outcomes. Partner with teams to create wing-to-wing transactional views, trends and anomalies leveraging GE's data lake. Look for new ways to harness the data we have for insights and actionable outcomes. 

 
In This Role, You Will

Essential Responsibilities: 


  • Develop Spotfire reports utilizing advanced data visualization techniques and related SQL.
  • Leverage Treasury Data Lake and data virtualization technologies (Denodo) to deliver new capabilities on tablet and mobile platforms.
  • Work on an agile team using Rally to quickly prototype and iterate on ideas
  • Lead the research and evaluation of emerging technology, industry and market trends to assist in project development and/or operational support activities
  • Perform technical analysis working with PostgreSQL & AWS native services
  • Partner with business teams to define requirements & user stories
  • Build and implement analytical models with R and Python
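As a hedged illustration of that last responsibility (not GE's actual code), the short Python sketch below surfaces transaction anomalies against a rolling baseline; the file and column names (treasury_transactions.csv, posted_date, amount) are invented, and the real data would come from the Treasury Data Lake via Denodo/PostgreSQL.

```python
# Hypothetical sketch: flag days whose transaction totals drift far from a 30-day baseline.
import pandas as pd

txns = pd.read_csv("treasury_transactions.csv", parse_dates=["posted_date"])

daily = txns.groupby("posted_date")["amount"].sum().sort_index()
baseline = daily.rolling(window=30, min_periods=10).mean()
spread = daily.rolling(window=30, min_periods=10).std()

# Simple z-score style rule: anything more than 3 standard deviations from the
# 30-day baseline is surfaced for review (e.g. in a Spotfire report).
anomalies = daily[(daily - baseline).abs() > 3 * spread]
print(anomalies)
```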


Qualifications/Requirements
  • Bachelor's degree from an accredited university or college in Computer Science or Information Systems
  • One or more years of experience in the design and development of data-centric applications leveraging data from enterprise data warehouses.


Eligibility Requirements:

  • Legal authorization to work in the U.S. is required. We will not sponsor individuals for employment visas, now or in the future, for this job


Technical Expertise

Desired Characteristics:
  • 1+ years of experience with BI visualization and/or reporting tools (expert-level knowledge of modern BI platforms like Spotfire, Qlik, Tableau, etc.); a data and reporting guru.
  • Experience with web technologies such as ASP, HTML, and CSS; integration of these with data visualization tools (e.g. extensions) is a plus
  • Experience with scripting languages like JavaScript, Python, etc.
  • Exposure to advanced analytic & data science applications
  • Excellent BI application development skills, as demonstrated by having led, designed and implemented successful web and mobile projects
  • Ability to clearly articulate creative ideas to senior leaders
  • Ability to guide and direct technical team members through the SDLC
  • Ability to hit tight deadlines and work under pressure
  • SAP and/or Oracle ERP systems exposure a plus
  • Passion for learning new technologies and eagerness to collaborate with other creative minds
  • Strong desire for exploring, evaluating and understanding new technologies
X-Mode Social
  • Reston, VA
X-Mode provides real-time location data and technologies that power location intelligence for advertising and business decisions in financial services, healthcare, high-tech, real estate, retail, and the public sector. X-Mode's flagship product is a fast-growing big data location platform, which maps the precise daily routes of 10% of the U.S. population and maps 1 in 3 adult U.S. smartphone users monthly. X-Mode strives to produce and monetize the world’s largest location platform and ultimately create a global “living map” of 1 billion people with the highest quality location data, in order to fuel the best location intelligence business solutions.

X-Mode is looking for a well-rounded full-stack engineer with web application development experience who is interested in delivering well designed, intuitive dashboards for both our internal and external users.



WHAT YOU'LL DO:




    • Develop clean, reusable, and testable code with performance in mind

    • Optimize applications for time complexity and ease of extensibility

    • Utilize REST API best practices for an easily scalable user interface

    • Collaborate with the support team to identify and address bugs, help deliver a high level of customer satisfaction, and hear feedback directly from end users

    • Maintain design and progress information in Atlassian Jira and Confluence

    • Ensure the technical feasibility of UI/UX designs







WHO YOU ARE:




    • Degree in Computer Science, Engineering or a related field

    • 2-5+ years of experience in full-stack web application development

    • Real-world experience with RESTful and microservices architectures, and with JavaScript frameworks and tools, specifically React, Angular, and Node

    • Experience implementing complex/dynamic web user interfaces with HTML, CSS, and JavaScript/TypeScript

    • Good understanding of asynchronous request handling, partial page updates, and AJAX

    • Familiar with AWS

    • Experience connecting with relational and NoSQL databases

    • Must have past experience clearly communicating to design/UX any limitations or constraints in visualizing dynamic data

    • Past experience using code versioning tools, such as Git, Bitbucket

    • Experience in Agile / scrum methodologies

    • Independent problem solver with superior technical, analytical and troubleshooting skills

    • Strong interpersonal and communication (written and oral) skills







NICE TO HAVES:




    • Experience with AWS API

    • Knowledge of cross-browser compatibility issues and ways to work around them

    • Familiarity with tools such as Photoshop/InDesign

    • Experience with Scala







WHAT WE OFFER:




    • Competitive Salary

    • Medical, Dental and Vision

    • 15 Days of PTO (Paid Time Off)

    • Lunch provided 2x a week 

    • Snacks, snacks, snacks!

    • Casual dress code

    • Free Parking on-site


Carbon Lighthouse, Inc.
  • San Francisco, CA

Target Start Date: Ongoing


Reports To: Director of Software


Location: San Francisco, CA


Carbon Lighthouse is looking for an adaptable, self-motivated, full stack Senior Software Engineer to work and grow with us to envision and build out our software platform and accelerate environmental impact. Along the way, you’ll learn all about energy efficiency, solar, real estate markets, construction, and probably get your hands dirty (or at least dusty) on site visits.

We have taken the first step of transforming our data analysis and thermodynamics modeling toolset into a robust software platform, called CLUES®, built on the latest web technologies. Now it’s time to take our platform to the next level and convert it into the tool we use to fulfill our mission and have a global impact. Your role will initially concentrate on streamlining the flow of data from wireless sensors through CLUES, implementing optimization and machine learning to drive the automation of our analytics and modeling process, and looking for ways to scale CLUES to a platform that enables us to stop climate change.
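As a purely illustrative sketch of the sensor-data flow described above (not CLUES's actual design), and assuming an invented route, payload, and in-memory store, a minimal ingestion endpoint on a Python web stack might look roughly like this:

```python
# Hypothetical sketch only: a tiny sensor-reading ingestion endpoint.
from datetime import datetime, timezone
from flask import Flask, jsonify, request

app = Flask(__name__)
readings = []                      # stand-in for a real datastore (RDBMS / NoSQL)

@app.route("/sensors/<sensor_id>/readings", methods=["POST"])
def ingest(sensor_id: str):
    payload = request.get_json(force=True)
    reading = {
        "sensor_id": sensor_id,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "temperature_c": payload.get("temperature_c"),
        "power_kw": payload.get("power_kw"),
    }
    readings.append(reading)       # downstream: analytics / thermodynamic models
    return jsonify(reading), 201
```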

Your immediate career path will concentrate on designing and building out both the frontend and backend of CLUES, with many growth opportunities for specialization and management as our company expands. You should have a bachelor’s degree in computer science (or similar), 5+ years of professional development experience, and be excited by complex and open-ended engineering problems. You genuinely enjoy working on collaborative teams, and want to be involved in large scale challenges that require long term effort. While this role involves significant software development work, you will also spend time interacting with Carbon Lighthouse mechanical engineers, project managers, and energy performance engineers to connect our software to our real-world mission.

The role is based at our headquarters in downtown San Francisco, but may require occasional travel to our satellite offices or client sites.

About Carbon Lighthouse: 
Carbon Lighthouse is on a mission to stop climate change by making it easy and profitable for building owners to eliminate carbon emissions caused by wasted energy. The company’s unique approach to Efficiency Production goes deep into buildings to uncover and continuously correct hidden inefficiencies that add up to meaningful financial value and carbon elimination that lasts. Since 2010, commercial real estate, educational, hospitality and industrial customers nationwide have chosen Carbon Lighthouse to enhance building comfort, increase net operating income and achieve their sustainability goals. Backed by notable investors, we are a team of 84 that highly values question asking, getting it done, integrity, and teamwork. We appreciate a fulfilling work-life balance, prize transparency and communication, hold ourselves to high standards of performance and professionalism, strive for dynamism and innovation, and support our team members’ professional development. Every person has both the opportunity and responsibility to make an impact on our growing organization.


Responsibilities:



  • Develop full stack web applications in an Agile environment to increase the speed and efficiency of Carbon Lighthouse process and grow our overall product offering

  • Participate in the full product development cycle, including brainstorming, architecting, release planning and estimation, implementing and iterating on code, coordinating with internal and external clients, internal code and design reviews, MVP and production releases, quality assurance, and product support.

  • Collaboratively work on simultaneous projects with multiple stakeholders, both on the software team and company-wide

  • Work in a team environment, expressing ideas and being open to those of others, to effectively drive cross-team solutions that have complex dependencies and requirements

  • Participate in, study, and improve the current Carbon Lighthouse process



Required Qualifications:



  • Dedication to Carbon Lighthouse's environmental mission

  • BS in Computer Science or similar technical field, or have demonstrated exceptional experience in technology environments

  • 5+ years of development experience

  • Proven track record of building production web applications using Python, or other comparable technology



Relevant Framework and Tool Experience

Proven experience with the majority of the following:



  • Server side web frameworks (e.g. Express, Flask, etc.)

  • Building user interfaces using JavaScript, HTML, CSS, and front end frameworks (e.g. Angular, React, etc.)

  • Developing applications backed by RDBMS or NoSQL data stores (e.g. MySQL, MongoDB, etc.)

  • Developing scalable, robust, and fault-tolerant REST services and microservices

  • Unit testing frameworks (e.g. Mocha, Chai, unittest, pytest, etc.)

  • Version control systems (e.g. Git, SVN, etc.)

  • Test driven development, continuous integration, and continuous deployment

  • AWS



Bonus Qualifications:



  • Data science background

  • HVAC systems background

  • HVAC controls software/hardware background

  • UX/UI design experience

  • Startup experience



Compensation and Benefits:



  • Salary + equity

  • Medical, dental, vision, and disability insurance

  • Generous vacation policy

  • Fully paid maternity and paternity leave benefits

  • Subsidized public transit and bike to work benefits

  • 401(k)



Carbon Lighthouse is an equal employment opportunity employer and considers qualified applicants without regard to gender, sexual orientation, gender identity, race, veteran or disability status. 

If you’re excited about our environmental mission and this looks like a fit, you should apply! Please fill out the brief application form and be prepared to submit three references at a later date.