- Level: Senior
- Type: B2B
- English Level: Upper-Intermediate
- Location: Poland
- Skills: Java, Kotlin, SQL
Responsibilities
- Develop and maintain efficient, scalable code and infrastructure capable of handling thousands of operations per second
- Innovate and implement new features that elevate user experience and satisfaction
- Monitor and analyze application metrics, responding promptly to alerts to ensure high availability and performance
- Develop and maintain Spark ETL pipelines
- Fine-tune services and databases for optimal cost-efficiency and ease of maintenance
- Perform in-depth architecture and code reviews to uphold the highest standards of quality and engineering excellence
- Continuously improve technologies, ensuring robust security, streamlined deployments, and adoption of the latest advancements
Requirements
- 6+ years of hands-on Java development experience
- Proficiency in designing and building APIs using HTTP and REST
- Strong knowledge of relational databases
- Experience with AWS, Linux, and server-side development
- Experience with Spark, ETL pipelines, and big data storage formats, or willingness to learn them
- Understanding of distributed computing principles and algorithms
- Familiarity with asynchronous patterns and messaging technologies
- Expertise in design patterns for scalable, stable systems
- Open to adopting modern approaches to efficient development, including AI tools
- Spoken English, sufficient to discuss stakeholder requirements or troubleshoot issues
Nice to Have
- Experience with Python and other scripting languages
- Familiarity with NoSQL databases, large datasets, or cloud storage such as Amazon S3
Our Processes & Beliefs
- DevOps is not a team – it is a culture
- Continuous Deployment – multiple times daily
- Service-Oriented Architecture and APIs
- Full test automation – we have no QA team
- Clean Code – continuous improvement of the codebase as part of daily tasks
Key Projects
Our team is developing several key projects, including:
- AI Proxy – A universal API that provides unified access to all supported LLM providers
- AI Translations – A suite of services for generating translations and assessing translation quality
- Data Pipelines – Data pipelines of varying complexity that make data from all our services available in the Data Lake
- Data Infrastructure – The Data Lake itself, along with reporting and analytics tools
Technology Stack:
- Core Technologies: Java, Kotlin
- Databases and Messaging: MySQL, DynamoDB, RabbitMQ, Kafka
- Cloud and Infrastructure: AWS (EC2, ECS, S3, RDS, Lambda), Linux, Terraform, Docker
- Data Engineering: Apache Airflow, Spark, Parquet, Iceberg
What Matters to Smartling?
- Empower our clients to grow their businesses while you grow as a professional and individual.
- Learn new skills and advance your career through real-world challenges and cutting-edge tools.
- Be part of a small, energetic, and collaborative team that values your contributions.
- Enjoy a balanced work-life environment with flexible PTO.