WE ARE LOOKING FOR TALENT

We recruit highly talented professionals to build multicultural teams with a clear view of the real-world problems businesses face today. We promote an open and collaborative working environment driven by innovation and proactivity.

Hiring Spark Analysts & Developers

If you have an analytical mindset, enjoy challenges, are passionate about Big Data technologies, and have experience with Spark, there is room for your professional development at KEEDIO.

Skills & Experience

We are currently growing and seek professionals with at least 2 years of experience in software development and 1 year with Spark using Java and Scala.

Must-haves:
· Expertise in Big Data projects
· Languages: Java & Scala
· Spark
· Version control and quality tools
· SQL and NoSQL databases
· Hibernate
· Spring
· Maven
Nice-to-haves:
· Other languages such as Python or JavaScript
· AngularJS and/or Node.js
· Big Data: Flink, Storm, Flume, etc.
· Cloud: AWS, Azure, OpenStack, etc.
· Systems administration
· Agile methodologies

Hiring Java J2EE Analysts & Developers

If you have an analytical mindset, enjoy challenges, are passionate about Big Data technologies, and have experience with Java J2EE, there is room for your professional development at KEEDIO.

Skills & Experience

We are currently growing and seek professionals with at least 3 years of experience in Java J2EE software development.

Must-haves:
· Professional experience in Java J2EE
· Version control and quality tools (GitHub, Jenkins, Sonar)
· SQL and NoSQL databases
· Hibernate
· Spring
· Maven
Nice-to-haves:
· Other languages such as Python or JavaScript
· AngularJS and/or Node.js
· Big Data: Flink, Storm, Flume, Hadoop, etc.
· Cloud: AWS, Azure, OpenStack, etc.
· Systems administration
· Agile methodologies

Hiring Big Data Developers (Scala)

If you are passionate about Big Data technologies and have experience in Scala, there is room for your professional development at KEEDIO.

Must-haves:
  • A bachelor’s degree in Computer Science or Engineering.
  • Strong programming skills, ideally in Python, Scala, Java or a similar language.
  • Experience with Git and GitHub; accustomed to reviewing source code in pull requests. Good CI practices (e.g., Jenkins pipelines).
  • Agile methodologies.
  • Experience with Spark for batch processing. Knowledge of different Big Data file formats (Avro, Parquet, etc.). Experience with cloud storage services such as S3 or Azure Blob Storage.
  • Knowledge of traditional Big Data tools (MapReduce, Hadoop, Pig, Hive, Impala, etc.).
  • Working knowledge of SQL.

Nice-to-haves:
  • Knowledge of ETL tools such as NiFi or Pentaho.
  • Knowledge of cloud platforms: Microsoft Azure, Amazon Web Services (AWS).
  • Experience with streaming systems, such as Kafka.
  • Knowledge of DevOps tools like Ansible, Puppet, etc.
  • Experience with NoSQL databases, such as Couchbase, HBase, Cassandra, MongoDB.
  • Experience with Big Data ML toolkits, such as Mahout, Spark MLlib, or H2O.

Hiring Linux Systems Engineers

If you are passionate about Big Data technologies and have experience with Linux systems, there is room for your professional development at KEEDIO.

Must-haves:
  • Experience with public clouds.
  • Demonstrable experience with DevOps automation and deployment technologies (Puppet and Ansible).
  • Experience with version control tools.
  • Knowledge of scripting.
  • Experience with Linux/Unix systems.
  • Knowledge of networking.
  • Experience in the maintenance and management of production and non-production environments.
Nice-to-haves:
  • Knowledge of Red Hat technologies (OpenStack and/or OpenShift).
  • Knowledge of Big Data platforms and environments.
  • Knowledge of security.

Hiring Cloud Systems Engineers (Azure)

If you are passionate about cloud technologies, KEEDIO is the right place to join.

Skills & Experience

We are seeking a cloud systems engineer with at least two years of experience with Microsoft Azure. This position requires the candidate to interact with multiple technical teams, architects, security officers, managers and business users; review and document workloads, system dependencies and business requirements; and map workloads to the capabilities of Microsoft Azure for public, private and hybrid clouds.

Must-haves:
  • Designing and configuring Azure Virtual Networks (VNets), subnets, Azure network settings, DHCP address blocks, DNS settings, security policies and routing.
  • Designing Network Security Groups (NSGs) to control inbound and outbound access to network interfaces (NICs), VMs and subnets.
  • Deploying Azure IaaS virtual machines (VMs) and cloud services (PaaS role instances) into secure VNets and subnets.
  • Exposing virtual machines and cloud services in the VNets to the Internet using the Azure external load balancer and Application Gateway.
  • Implementing Azure Active Directory for single sign-on, authentication, authorization and Azure role-based access control (RBAC).
  • Implementing hybrid cloud architectures that include on-premises resources, networks, security, IaaS, PaaS and SaaS architectures.
  • Knowledge of Microsoft Azure ExpressRoute.
Nice-to-haves:
  • Experience with Azure Key Vault.
  • Experience with different Azure data storage solutions: Azure SQL Database, DocumentDB, Blob Storage.
  • Automation of Azure deployments using PowerShell.
  • Azure Container Service.

KEEDIO’s Internship Program 2017/18

Location: Ciudad de la Imagen (KEEDIO’s premises)

Number of positions: 8

Position type: Part Time

Duration: 6 months

Total Stipend: € 3600

KEEDIO is a company specialized in BigData, backed by 10 years of previous experience in distributed computing: HPC, HTC, grid computing, etc. Currently, our projects range from architecture deployment to analytics, working hands-on with established technologies and tools as well as KEEDIO’s own platform, which serves a range of institutions.

KEEDIO offers on-the-job training in BigData and innovation projects, where you will acquire competencies through day-to-day work on the challenges of the vibrant world of data and analytics. The traineeship program aims to be the stepping stone of your future career in BigData and Analytics, whether you later decide to move on elsewhere or stay with us.

We seek three profiles of technical candidates eager to learn and develop their careers in BigData and Analytics: system engineers, development engineers, and data engineers/scientists. System engineers and developers will focus on ongoing projects, both learning and contributing based on their skills and knowledge. Data engineers/scientists will be provided with datasets from which insights or prototypes are expected. You are encouraged to apply whether you are the rock star of your class or simply have a grounding in the required technical knowledge, genuine motivation and an attitude to learn and explore.

Profiles

System engineer
Must-haves: Academic degree in/related to Computer Science, focused on systems and architecture. OS administration and distributed systems. Highly motivated, eager to learn and curious about new technologies.
Tasks: Integration, deployment and administration of BigData architectures. Scripting, monitoring, and security.

Development engineer
Must-haves: Academic degree in/related to Computer Science, focused on development. Software development using Java, Python, or SQL. Highly motivated, eager to learn and curious about new technologies.
Tasks: Software development in BigData environments. Learning new frameworks and languages. Basic analytics.

Data engineer/scientist
Must-haves: Postgraduate degree in engineering or sciences with an analytical background and proficiency in at least one language. Knowledge of data analysis, modeling, inferential and descriptive statistics, and supervised and unsupervised learning. Data analysis in a language such as Python, R, or MATLAB. Communication skills. Motivated and eager to learn.
Nice-to-haves: MOOCs, courses outside the official academic CV, specialization courses, etc. Experience in projects (academic or otherwise) analyzing data with mathematical/statistical tools.
Tasks: Development of the full data-analysis cycle (collection, cleaning, exploratory analysis, modeling and outcome presentation) using BigData and BI technologies.

The selected candidates will join a young, dynamic team made up of professionals with diverse backgrounds, closely connected to cutting-edge technologies. They will also enjoy a flexible schedule, a competitive salary, and social benefits.

Contact us or send your CV to:

rrhh@keedio.com