Hands-on FP workshops rooted in the biggest Scala conference in Europe.
Located in one of the most vibrant cities
in Central Europe.
Practical workshops led by world-class experts.
Meeting point for FP enthusiasts filled with networking and the after-party.
Monix is one of the most mature functional libraries for concurrency in Scala and we will explore it together!
We will start by learning the basics of Task and Observable and then use them to solve increasingly difficult problems to accommodate new business requirements. We will also look at some tools that can help us test and debug our application.
By the end of the workshop, any attendee should have a good idea how to use Monix and connect its pieces to write purely functional applications.
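As a small taste of those basics, here is a minimal sketch using Monix's Task and Observable (the values are illustrative, and the API shown is Monix 3.x):

```scala
import monix.eval.Task
import monix.reactive.Observable
import monix.execution.Scheduler.Implicits.global

object MonixBasics extends App {
  // A Task is a lazy description of a computation; nothing runs yet
  val greeting: Task[String] = Task("Hello, Monix!")

  // An Observable describes a (possibly infinite) stream of elements
  val doubledSum: Task[Long] =
    Observable.fromIterable(1 to 5)
      .map(_ * 2L)
      .sumL // fold the whole stream into a single Task

  // Effects happen only when we run at the "end of the world"
  println(greeting.runSyncUnsafe())
  println(doubledSum.runSyncUnsafe())
}
```

Note how composition happens entirely on descriptions; `runSyncUnsafe` appears exactly once, at the edge of the program.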
One of the biggest drawbacks of Akka Actors, the lack of type safety, has finally been addressed. Since the 2.6.0 release, Akka Typed is stable and recommended as the default choice when working with the Akka stack.
Let's play with the new actors API, walk through the basics of behaviors, and learn about the actor lifecycle, supervision, scheduling and other fundamental concepts. Then we'll compare it to the classic approach. Of course, not everybody is lucky enough to work on a greenfield project, so we'll also show you how to introduce Akka Typed into an existing codebase and combine the two worlds of actors within a single application.
During this workshop we'll explore important patterns and takeaways we’ve learned while introducing typed actors to our system.
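To give a flavour of the new API, here is a minimal typed actor sketch (the `Counter` protocol is a made-up example, not from the workshop material):

```scala
import akka.actor.typed.{ActorSystem, Behavior}
import akka.actor.typed.scaladsl.Behaviors

object Counter {
  // The protocol: the only messages this actor can receive
  sealed trait Command
  final case class Increment(amount: Int) extends Command
  case object Stop extends Command

  // State changes by returning a new behavior, not by mutation
  def counting(total: Int): Behavior[Command] =
    Behaviors.receiveMessage {
      case Increment(amount) => counting(total + amount)
      case Stop              => Behaviors.stopped
    }
}

object Main extends App {
  val system = ActorSystem(Counter.counting(0), "counter")
  system ! Counter.Increment(1) // compiles: Increment is a Command
  system ! Counter.Stop         // sending a String here would not compile
}
```

The type parameter of `Behavior[Command]` is exactly the type safety the classic `Any`-typed `receive` lacked.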
Scala is a multi-paradigm programming language, incorporating features from both object-oriented programming and functional programming. But most object-oriented programmers who write code using Scala struggle to understand functional concepts, so they only use Scala as a better Java. In this workshop, you will learn how to write purely functional code using Scala, and you will also learn about the common functional abstractions, including semigroups, monoids, functors, applicatives, and monads.
At the end, Wiem will explain the purpose behind IO monads and walk through simple examples to get started with ZIO, a functional programming library for Scala.
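To illustrate the kind of abstraction covered, here is a minimal, self-contained Monoid sketch in plain Scala (instances and names are illustrative):

```scala
// A minimal Monoid type class: an associative combine plus an identity
trait Monoid[A] {
  def empty: A
  def combine(x: A, y: A): A
}

object Monoid {
  implicit val intAdd: Monoid[Int] = new Monoid[Int] {
    def empty = 0
    def combine(x: Int, y: Int) = x + y
  }
  implicit val stringConcat: Monoid[String] = new Monoid[String] {
    def empty = ""
    def combine(x: String, y: String) = x + y
  }
}

// One generic function works for every type that has a Monoid instance
def combineAll[A](xs: List[A])(implicit m: Monoid[A]): A =
  xs.foldLeft(m.empty)(m.combine)

assert(combineAll(List(1, 2, 3)) == 6)
assert(combineAll(List("a", "b", "c")) == "abc")
```

The payoff is reuse: `combineAll` is written once and works for integers, strings, or any type you give an instance for.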
Thanks to the rise of functional effect systems like ZIO, fiber-based concurrency models have become popular in Scala. However, few people understand what fibers are, how fibers differ from JVM threads, what problems you can solve with fibers, and how to diagnose problems in fiber-based code.
In this two-hour, hands-on workshop, ZIO contributor Rafael Saraiva will compare fibers and threads, explore various means of composing fibers, show how to use fibers to solve basic problems in concurrency and parallelism, and discuss ZIO-specific features like auto-supervision and programmable interruption. Lastly, attendees will learn how to debug async code using fiber-based execution traces and fiber dumps.
Attendees will leave the workshop with a clear understanding of the advantages of fibers over threads, with an excellent working knowledge of how to use them to their full potential (in new, unparalleled ways) and with new critical skills in diagnosing production issues for fiber-based code.
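As a minimal sketch of the fork/join style discussed above, assuming the ZIO 1.x API (the values are illustrative):

```scala
import zio._

object ForkJoin extends App {
  val expensive: UIO[Int] = ZIO.succeed(21) // stands in for real work

  val program: UIO[Unit] =
    for {
      fiber  <- expensive.fork        // start on a new fiber; don't block
      other  <- ZIO.succeed(21)       // keep working on the current fiber
      result <- fiber.join            // wait for the forked result
      _      <- UIO(println(result + other))
    } yield ()

  def run(args: List[String]) = program.exitCode
}
```

Unlike spawning a JVM thread, `fork` is cheap enough to use for every small concurrent task, and the fiber stays supervised by the ZIO runtime.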
Writing tests is no fun! You have to deal with non-blocking async tests, finicky concurrent tests, nondeterministic flaky tests, and tests dependent on platform or environment settings. You have to test code that depends on external services, code that requires costly test fixtures, and code that needs mocks or test implementations. While existing Scala testing libraries have partial solutions to these problems, none of them has been built on a modern functional effect system like ZIO, which provides a fully integrated solution to async, concurrency, flakiness, testability, and resource management.
In this two-hour workshop, Adam Fraser, one of the authors of ZIO Test, will show attendees how to use ZIO Test to solve common pain points that everyone has with Scala testing libraries—whether they use ZIO, another functional effect system, or no functional effect system at all! In addition to solving common pain points, attendees will learn how to test functional effects (including async and concurrent effects), how to test third-party software like Akka and Spark, and how to define their own composable test aspects to weave custom logic into individual tests or across suites of tests.
Discover how the functional side of Scala can make writing the trickiest tests fun again with ZIO Test!
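A minimal sketch of what this looks like, assuming the ZIO 1.x flavour of ZIO Test (the spec name and values are invented for illustration):

```scala
import zio.ZIO
import zio.duration._
import zio.test._
import zio.test.Assertion._
import zio.test.environment.TestClock

object ExampleSpec extends DefaultRunnableSpec {
  def spec = suite("ExampleSpec")(
    testM("sleeping effects finish instantly on the TestClock") {
      for {
        fiber <- ZIO.sleep(5.minutes).as(42).fork
        _     <- TestClock.adjust(5.minutes) // no real-time waiting
        value <- fiber.join
      } yield assert(value)(equalTo(42))
    } @@ TestAspect.nonFlaky // an aspect: rerun to catch nondeterminism
  )
}
```

Because the test itself is an effect, time, retries, and concurrency are all simulated deterministically instead of flaking in CI.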
Functional design is the science and art of using functional tools at your disposal—from functions to ADTs to algebraic structure—to craft domain-specific solutions to problems that are simple, composable, and testable. In this workshop with John A. De Goes, you will get a chance to hone your own skills in functional design, as you are presented with an imperative solution to a problem, and charged with building a functional alternative that shows the power of functional design to solve the everyday problems of commercial software development.
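A tiny sketch of the "description versus interpretation" idea at the heart of functional design, using a hypothetical Schedule domain (not taken from the workshop material):

```scala
// The description: an ADT of schedules, built from small composable pieces
sealed trait Schedule
object Schedule {
  case object Daily extends Schedule
  final case class EveryNHours(n: Int) extends Schedule
  final case class Both(left: Schedule, right: Schedule) extends Schedule
}

// Operators return new descriptions, so solutions compose like values
implicit class ScheduleOps(private val self: Schedule) {
  def or(that: Schedule): Schedule = Schedule.Both(self, that)
}

// One possible interpreter: does this schedule fire at a given hour?
def firesAt(s: Schedule, hour: Int): Boolean = s match {
  case Schedule.Daily          => hour == 0
  case Schedule.EveryNHours(n) => hour % n == 0
  case Schedule.Both(l, r)     => firesAt(l, hour) || firesAt(r, hour)
}

assert(firesAt(Schedule.EveryNHours(6) or Schedule.Daily, 12))
assert(!firesAt(Schedule.EveryNHours(6), 7))
```

The imperative alternative would interleave the "what" and the "how"; here, new interpreters (pretty-printers, optimizers, testers) can be added without touching the descriptions.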
I want to show how to use property-based testing in ScalaTest from the very beginning. We will start with the simple generators that come with ScalaTest and write our own based on them. After we go through the basics, we will move on to concrete cases where I will show how properties can save us from bugs that are not easy to spot when writing unit tests.
For the workshop, Michał requires basic Scala and ScalaTest knowledge.
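As a first taste, here is a property test sketch assuming ScalaTest with the scalatestplus ScalaCheck integration (the generator and property are illustrative):

```scala
import org.scalacheck.Gen
import org.scalatest.propspec.AnyPropSpec
import org.scalatestplus.scalacheck.ScalaCheckPropertyChecks

class ReverseSpec extends AnyPropSpec with ScalaCheckPropertyChecks {
  // A custom generator built from the primitives the library ships with
  val smallLists: Gen[List[Int]] =
    Gen.listOfN(10, Gen.choose(-100, 100))

  property("reversing twice gives back the original list") {
    forAll(smallLists) { xs =>
      assert(xs.reverse.reverse == xs)
    }
  }
}
```

Instead of a handful of hand-picked inputs, the property is checked against many generated lists, which is exactly where the hard-to-spot bugs tend to hide.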
It might seem that defining HTTP APIs in Scala is a solved problem. Or is it? Tasks such as generating OpenAPI documentation or auto-generating clients have always been a challenge.
Let's fix this! We'll apply an approach of separating the **description** of a problem from its **interpretation**. This has proven to be a powerful tool in other domains (such as modelling concurrently running processes), so let's see how it works for HTTP APIs, using the [sttp tapir](https://github.com/softwaremill/tapir) library.
In the workshop we'll start by creating a simple application exposing an HTTP API. Next, we'll expose documentation using the Swagger UI. Then we'll proceed to more advanced topics, such as supporting custom types, validation, authentication, error handling and re-usable endpoint descriptions. Along the way, we'll discuss some of the design decisions and challenges, and demonstrate the type safety and how it impacts the "approachability" of the API.
After the workshop, you'll not only be able to build a self-documenting Scala HTTP API, but you might also get a tapir mascot!
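To show what "description as a value" looks like in tapir, here is a minimal endpoint sketch (the path and types are invented; the exact `Endpoint` type signature varies between tapir versions):

```scala
import sttp.tapir._

// The endpoint is a plain value: GET /hello?name=... returning a string.
// No server, no docs yet: those come from interpreting this description.
val hello =
  endpoint.get
    .in("hello")
    .in(query[String]("name"))
    .errorOut(stringBody)
    .out(stringBody)
```

The same `hello` value can then be interpreted as a server route, an HTTP client, or OpenAPI documentation, which is precisely the separation of description from interpretation described above.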
This workshop is aimed at people who already understand the syntax, and covers: how implicits work; how implicits are used to implement type classes (type class instances plus extension methods); how to write hardcoded derivations for built-in types; how to derive instances generically (shapeless vs Magnolia); and how to debug the most common issues with implicits.
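The "type class instance + extension methods" pattern mentioned above can be sketched in a few lines of plain Scala (the `Show` type class is a standard illustrative example, not workshop code):

```scala
// The type class: a capability described as a trait
trait Show[A] {
  def show(a: A): String
}

object Show {
  // Instances: implicit values providing the capability per type
  implicit val showInt: Show[Int] = (a: Int) => s"Int($a)"
  implicit val showBool: Show[Boolean] = (a: Boolean) => s"Bool($a)"

  // Extension method: anything with a Show instance gets .show
  implicit class ShowOps[A](private val a: A) {
    def show(implicit ev: Show[A]): String = ev.show(a)
  }
}

import Show.ShowOps
assert(42.show == "Int(42)")
assert(true.show == "Bool(true)")
```

If `.show` is called on a type with no instance in scope, the compiler fails with an "implicit not found" error, which is the kind of issue the debugging part of the workshop addresses.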
Oli will start the discussion with a recursive data structure that everyone uses in every project!
We are going to talk about a list. We will copy-paste two functions `foldRight` and `unfold` from the Scala standard library and start our exploration from scratch.
How do these functions relate to recursion schemes? They are in fact mere special cases of more general functions that are named `cata` and `ana`!
To see convincing proof of this, please attend the workshop and follow the steps of refactoring.
We will also explore how recursion schemes are used in practice!
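To hint at where the refactoring leads, here is a self-contained sketch of `foldRight` and `unfold` specialised to `List`, written to suggest their recursion-scheme names (a simplification of the generic `cata`/`ana`, which abstract over the pattern functor):

```scala
// foldRight "tears down" a list: the List special case of cata
def cata[A, B](xs: List[A])(nil: B)(cons: (A, B) => B): B =
  xs match {
    case Nil    => nil
    case h :: t => cons(h, cata(t)(nil)(cons))
  }

// unfold "builds up" a list from a seed: the List special case of ana
def ana[A, S](seed: S)(step: S => Option[(A, S)]): List[A] =
  step(seed) match {
    case None            => Nil
    case Some((a, next)) => a :: ana(next)(step)
  }

val countdown = ana(3)(n => if (n == 0) None else Some((n, n - 1)))
assert(countdown == List(3, 2, 1))
assert(cata(countdown)(0)(_ + _) == 6)
```

The generic versions replace the hardcoded `Nil`/`::` cases with an arbitrary pattern functor, which is the refactoring the workshop walks through.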
The workshop will be centered on ONNX-Scala: An ONNX (Open Neural Network eXchange) API, Code Generator and Backend for Typeful, Numerically Generic, Functional Deep Learning in Scala.
Software transactional memory (STM) is an abstraction analogous to database transactions used to build safe, composable and modular code dealing with concurrency.
Attendees of this workshop will learn how to use STM and its accompanying data structures (e.g. TMap, TQueue, etc.) by solving some of the well-known concurrency challenges. In addition to that, we'll dive into the library implementation details, and discuss the impact of STM on program semantics and performance.
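A classic STM example is an atomic transfer between two accounts; a minimal sketch assuming the ZIO 1.x STM API (names and amounts are illustrative):

```scala
import zio._
import zio.stm._

// Transfer between two accounts atomically, with no locks.
// If the balance is insufficient, the transaction retries when
// `from` changes, instead of failing or busy-waiting.
def transfer(from: TRef[Int], to: TRef[Int], amount: Int): UIO[Unit] =
  (for {
    balance <- from.get
    _       <- STM.check(balance >= amount)
    _       <- from.update(_ - amount)
    _       <- to.update(_ + amount)
  } yield ()).commit
```

Either both updates happen or neither does, and two concurrent transfers can never observe a half-completed state, which is what makes STM code composable in a way lock-based code is not.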
Develop a full-stack application in a mixed-language environment using Scala.js, Node.js Express, React, ZIO, and Python. The full-stack approach will use a Scala.js-first model and incorporate a Python-based NLP model (built on a pre-trained BERT model), developed in class, to power an "intelligent" API.
The development environment will use a container-based model for both the backend and the frontend. Containers allow precise control over the dependencies. The backend will contain a combined Node.js Express web server based on Scala.js and ZIO. A separate Python-based API server will provide the NLP model. The frontend container will be a Node.js Express web server serving static assets, including the frontend Scala.js web application. The frontend will be based on React using the scalajs-reaction facade, bundled with webpack, and will incorporate ZIO. The NLP model will be built using TensorFlow/Keras and use a standard dataset available in TensorFlow. ZIO will be used for effect management and for exploring concurrency versus parallelism.
DAML, while a relative newcomer in the blockchain world, is already stirring some waves within the community. Able to run on different physical blockchains (Hyperledger Sawtooth & Fabric, VMware Blockchain and others), it offers a simple smart contract modelling language, a functional and highly type-safe programming language, and native integration with Scala. Despite those impressive facilities, DAML is still one of the easiest blockchain environments to master.
During this workshop, you will learn: the basic operating principles and philosophy of DAML; smart contract modelling; Akka-based Scala integration and how to use it to create applications; contract modelling patterns that solve problems common in blockchain applications; and architecture patterns for different flavours of DLT.
The objective of the workshop is to give a theoretical and practical overview of the functional approach to IO-based programming. You will learn how to program real applications using Scala, Cats, Cats Effect, Cats MTL, Meow MTL, and others.
During the workshop, we will switch between quick introductions of the core features and longer step-by-step exercises. This will expose you to some features and tools needed to create and maintain production applications. We will implement a movie recommendation system.
This event is open for all programmers that know the basics of FP in Scala (immutability, pure functions, higher-order functions, basic type classes, and their instances).
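As a tiny preview of the style used in the exercises, a Cats Effect 2.x sketch (the recommendation logic is a placeholder, not the workshop's actual system):

```scala
import cats.effect.{ExitCode, IO, IOApp}

object Recommend extends IOApp {
  // IO values are descriptions; nothing runs until the app's edge
  def recommend(user: String): IO[String] =
    IO.pure(s"$user might like: A Purely Functional Movie")

  def run(args: List[String]): IO[ExitCode] =
    for {
      rec <- recommend("alice")
      _   <- IO(println(rec)) // the println side effect is suspended in IO
    } yield ExitCode.Success
}
```

The workshop builds this idea up, step by step, into a full movie recommendation system.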
During the workshop we will take a journey from Scala code, through the generated bytecode, down to the assembly code that is actually executed. The goal of the workshop is to introduce tools such as javap, JITWatch, jconsole, GC logs and others that help developers understand what is happening "under the hood". We will look into some optimizations performed by the Oracle JVM and OpenJDK, as well as take a brief look at project Graal and the features coming in the next versions of the JVM.
In practice, it rarely happens that Scala code needs assembly-level analysis. But knowing some low-level details and principles helps greatly in tracing performance problems introduced at higher levels. Besides, it helps greatly once you know there is no magic; it is just a stupid machine.
Have you ever wondered what is needed to gather all the tourist information of an entire country? Or do you want to give your head a break while glancing at some beautiful pictures from Switzerland? Then this workshop is made for you. I will introduce you to the problems we had to tackle for the relaunch of www.myswitzerland.com and the solutions we tried.
You will learn about event-carried state transfer and idempotency, the challenges of mixing functional and object-oriented programming, the tasks for which functional programming was a natural match, and the Scala features that made the difference for our project. For each of those Scala features we will solve exercises, partially as a group and partially everyone on their own. Last but not least, we will talk about why Either is such a beautiful construct that I devoted the title of this talk to it.
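A small, self-contained sketch of why Either composes so nicely (the hotel domain is invented for illustration; `toIntOption` assumes Scala 2.13+):

```scala
// Either makes failure explicit and composes via for-comprehensions
final case class Hotel(name: String, stars: Int)

def parseStars(raw: String): Either[String, Int] =
  raw.toIntOption.toRight(s"not a number: $raw")

def validateStars(stars: Int): Either[String, Int] =
  if (stars >= 1 && stars <= 5) Right(stars)
  else Left(s"stars out of range: $stars")

// The first Left short-circuits the whole pipeline
def hotel(name: String, rawStars: String): Either[String, Hotel] =
  for {
    parsed <- parseStars(rawStars)
    stars  <- validateStars(parsed)
  } yield Hotel(name, stars)

assert(hotel("Grand", "4") == Right(Hotel("Grand", 4)))
assert(hotel("Grand", "nine") == Left("not a number: nine"))
assert(hotel("Grand", "9") == Left("stars out of range: 9"))
```

Every step states its failure in the type, so the "what can go wrong" is visible at the signature instead of hiding in exceptions.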
You are convinced side effects should be modeled in a functional way. You use a functional effect system such as Cats IO, Monix or ZIO. Now your program is a value and does not actually do anything unless you unsafeRun it at the end of the world. At that point in time all side effects and concurrency will manifest. How does that actually work?
During this workshop, we will build a minimal effect runtime, similar to the one used in ZIO, that interprets the ZIO API. We will learn how effects get executed and how parallelism and concurrency can be implemented using fibers, a.k.a. green threads.
We will start from a very simple runtime, which does not implement any concurrency or advanced features and just provides a trampoline for stack-safe execution of effects. From there, we will add more primitives until we can support concurrency, including forking and joining, cooperative and preemptive scheduling, interruption, and the modeling of, and recovery from, errors.
The focus will be on explaining how the individual primitives can be built and composed into a runtime. We will not focus on performance or feature completeness.
Basics of functional effect systems will be explained briefly, but this workshop is not meant as an introduction to IO monads / bifunctors. Some basic prior knowledge will be helpful for following the workshop.
The presenter had the idea for this workshop while fixing a bug in the ZIO runtime and realizing that it is actually quite easy to understand once you peel away a few advanced features and some low-level optimizations.
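The trampoline starting point can be sketched in a page of plain Scala. This is a deliberately simplified toy (the `Effect` ADT and names are invented for illustration), not the actual ZIO runtime:

```scala
// A tiny effect ADT: pure values, suspended thunks, and sequencing
sealed trait Effect[+A]
final case class Pure[A](a: A) extends Effect[A]
final case class Delay[A](thunk: () => A) extends Effect[A]
final case class FlatMap[A, B](fa: Effect[A], f: A => Effect[B]) extends Effect[B]

// The trampoline: continuations live on an explicit list, not the JVM stack
def run[A](effect: Effect[A]): A = {
  var current: Effect[Any] = effect
  var stack: List[Any => Effect[Any]] = Nil
  var result: Option[Any] = None
  while (result.isEmpty) {
    current match {
      case FlatMap(fa, f) =>
        stack = f.asInstanceOf[Any => Effect[Any]] :: stack
        current = fa
      case Delay(thunk) =>
        current = Pure(thunk())
      case Pure(a) =>
        stack match {
          case f :: rest => stack = rest; current = f(a)
          case Nil       => result = Some(a)
        }
    }
  }
  result.get.asInstanceOf[A]
}

// Deep nesting that would overflow the JVM stack if run naively
def count(n: Int): Effect[Int] =
  if (n == 0) Pure(0)
  else FlatMap(Delay(() => 1), (one: Int) =>
    FlatMap(count(n - 1), (rest: Int) => Pure(one + rest)))

assert(run(count(100000)) == 100000)
```

From this skeleton, the workshop adds fibers, scheduling, interruption, and error handling as further cases of the ADT and the run loop.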
So, you want to build a stream processing pipeline to do some serious data crunching on a massive data feed. Of course, it should be scalable and resilient to data loss. Easy: we'll use Kafka and Kubernetes, of course. But having to manage and integrate with tools like Kafka is complex and time-consuming, not to mention the YAML hell that a platform like Kubernetes introduces.
Meet Cloudflow! Cloudflow is Lightbend’s latest product aimed at reducing the time required to create, package and deploy streaming data pipelines on Kubernetes. It offers powerful abstractions allowing you to define the most complex streaming applications. It also seamlessly integrates with streaming platforms like Akka Streams, Flink and Spark. In this workshop, we’ll introduce the reasoning and concepts behind Cloudflow. You’ll gain hands-on experience in creating and deploying your own Cloudflow project.
We will learn what Kafka Connect is and how to use it to absorb data from various databases, and we will cover how to use that ETL tool to make our data flow continuously into an Apache Kafka cluster. We will have fun with its REST API and database queries, observing changes on Kafka topics and talking mostly about the mechanisms connectors use in different databases. The main part of the workshop will focus on the Kafka Streams API, using it to manipulate data absorbed by Kafka Connect in real time. We will start with streaming basics, then take a tour through the Kafka Streams API, learning by example how that toolset works with our data; at the end, we will put all that knowledge together to tackle some more complex situations, still having fun. All of that, of course, using Scala.
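To give an idea of the Streams API in Scala, here is a minimal topology sketch assuming the kafka-streams-scala DSL (topic names are invented, and import paths vary slightly between Kafka versions):

```scala
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder

val builder = new StreamsBuilder()

// Read records from a topic fed by Kafka Connect, transform, write back
builder
  .stream[String, String]("connect-input")
  .filter((_, value) => value.nonEmpty)
  .mapValues(_.toUpperCase)
  .to("processed-output")

val topology = builder.build() // pass to KafkaStreams together with config
```

Each record arriving on `connect-input` flows through the transformations continuously, which is the real-time manipulation the workshop builds up to.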
The registration fee covers some of the organizational costs. In return, we'll deliver the best Scalar so far!