In March's (def shef), we're looking at dependent typing and the kinds of problems it can help with, using the Idris language (https://www.idris-lang.org/).
We're all beginners with this language, so everyone is welcome! We'll be working through introductory materials (feel free to have a read around and a go yourself beforehand) and looking for problems that demonstrate the advantages of a dependently typed language.
This will be a hands-on session that isn't very structured. You'll need a laptop, and it'll help if you've already installed Idris (alternatively, you can try an online REPL like https://tryidris.herokuapp.com/console).
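If you want a quick flavour of what dependent types buy you before the session, here's a minimal sketch of a length-indexed vector. It's written in Haskell (with GADTs and DataKinds) rather than Idris, and the names are made up for illustration; Idris expresses the same idea much more directly (see its Data.Vect).

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    -- Natural numbers, promoted to the type level.
    data Nat = Z | S Nat

    -- A vector whose length is part of its type.
    data Vec (n :: Nat) a where
      VNil  :: Vec 'Z a
      VCons :: a -> Vec n a -> Vec ('S n) a

    -- Taking the head of an empty vector is a type error, not a runtime crash.
    vhead :: Vec ('S n) a -> a
    vhead (VCons x _) = x

    -- The type guarantees the output is exactly as long as the input.
    vmap :: (a -> b) -> Vec n a -> Vec n b
    vmap _ VNil         = VNil
    vmap f (VCons x xs) = VCons (f x) (vmap f xs)

    main :: IO ()
    main = print (vhead (VCons "no crashes here" VNil))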
Ever wondered what makes Clojure so appealing to work with? In February, Ed will show how the REPL and other unusual features of Clojure make it a productive and fun language to use!
He'll begin with a brief introduction to Clojure before diving into an interactive, live coding session.
If you want to brush up on your Clojure in the meantime, http://www.4clojure.com/ has a collection of Clojure exercises to practise on. Get set up quickly with https://clojure.org/guides/getting_started, and https://www.braveclojure.com/ is a great, free introduction to the language.
This follows on from last month's k-nearest neighbour implementation workshop, but we'll recap how a k-NN classifier works, so no previous knowledge is required.
This time, we'll look at how you'd use a kNN classifier to solve problems: how to prepare data before training and testing the classifier, and then how to use the test results to iterate, changing details to try to improve performance. We'll focus on a tried-and-tested implementation like Python's scikit-learn kNN classifier (http://scikit-learn.org/stable/modules/neighbors.html), but you're welcome to use any language and classifier you like, including one you've built yourself.
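To make that train-and-test loop concrete, here's a minimal, hedged sketch in Haskell (any language would do). The "classifier" is a deliberately dumb majority-class baseline standing in for whatever kNN implementation you bring along, and the tiny data set and names are made up for illustration.

    import Data.List (genericLength, group, maximumBy, sort)
    import Data.Ord (comparing)

    -- An example is a feature vector plus a known label.
    type Example = ([Double], String)

    -- Hold back the first nTest examples for testing (a very naive split;
    -- in practice you'd shuffle, or cross-validate).
    split :: Int -> [a] -> ([a], [a])
    split nTest xs = (drop nTest xs, take nTest xs)

    -- Placeholder classifier: always predict the most common training label.
    majorityClass :: [Example] -> [Double] -> String
    majorityClass training _ =
      head . maximumBy (comparing length) . group . sort $ map snd training

    -- Fraction of test examples the classifier gets right.
    accuracy :: ([Example] -> [Double] -> String) -> [Example] -> [Example] -> Double
    accuracy classify training test =
      genericLength [ () | (p, label) <- test, classify training p == label ]
        / genericLength test

    main :: IO ()
    main = do
      let examples = [ ([1,1],"red"), ([5,5],"blue"), ([1,2],"red")
                     , ([6,5],"blue"), ([5,6],"blue") ]
          (training, test) = split 2 examples
      -- Score the baseline, then swap in your kNN classifier, change k or the
      -- distance function, and re-run to see whether it does better.
      print (accuracy majorityClass training test)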
We may also have a couple of short "lightning talks" from group members about how they're using or planning to use machine learning!
They're a set of problems designed to be a hands-on introduction to Monads, written in Haskell. A fairly basic knowledge of Haskell is required, and the concepts you'll need to understand are listed on the Monad Challenges page. You'll also need a computer and a working Haskell environment. If you're just getting started, there are plenty of tutorials out there, like learnyouahaskell.com, that should get you set up and cover the concepts.
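If you haven't met monads before, here's a tiny taste of the territory (not one of the challenges): chaining lookups that can fail, first by hand and then with the Maybe monad. The phone book is made up for illustration.

    -- A made-up phone book: lookups can fail, so results are Maybe values.
    type PhoneBook = [(String, String)]

    book :: PhoneBook
    book = [("alice", "0114 111"), ("bob", "0114 222")]

    -- Without monads: nested case expressions for every step that can fail.
    bothNumbersExplicit :: String -> String -> PhoneBook -> Maybe (String, String)
    bothNumbersExplicit a b pb =
      case lookup a pb of
        Nothing -> Nothing
        Just na -> case lookup b pb of
          Nothing -> Nothing
          Just nb -> Just (na, nb)

    -- With the Maybe monad: "stop at the first failure" is handled by (>>=),
    -- which is what the do-notation below desugars to.
    bothNumbers :: String -> String -> PhoneBook -> Maybe (String, String)
    bothNumbers a b pb = do
      na <- lookup a pb
      nb <- lookup b pb
      return (na, nb)

    main :: IO ()
    main = print (bothNumbers "alice" "bob" book, bothNumbers "alice" "carol" book)
    -- (Just ("0114 111","0114 222"),Nothing)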
All experience levels welcome. I'm a Haskell beginner myself, so I'm doing these tutorials to prepare! If you're struggling with the language or the concepts, come along anyway and we'll help you!
Thanks to Mat for suggesting this topic!
(def shef) thanks Sky Betting and Gaming for kindly sponsoring this event. Based in Leeds and Sheffield, they're one of the Sunday Times' Top 100 companies to work for, and there are roles available now.
We're going to have a go at the Monad Challenges next month. You'll need some Haskell (or a language with similar capabilities) for that, so this month we'll take on a less taxing challenge: understanding and implementing finite state automata in a functional style.
Finite state automata are a classic idea in computer science and can recognise exactly the languages that regular expressions describe.
If you've read Tom Stuart's "Understanding Computation", we'll be following chapter 3, but implementing in a functional style instead of in OO Ruby. It's a nice problem for trying out a new language... maybe Haskell!
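To give a feel for the shape of the problem, here's a minimal sketch of a deterministic finite automaton in Haskell; the record layout and the even-number-of-'a's example are made up for illustration, and nondeterministic automata are the natural next step.

    -- A DFA is a start state, a set of accepting states (here, a predicate),
    -- and a transition function.
    data DFA state sym = DFA
      { start      :: state
      , accepting  :: state -> Bool
      , transition :: state -> sym -> state
      }

    -- Running the automaton over an input is just a left fold.
    accepts :: DFA state sym -> [sym] -> Bool
    accepts dfa = accepting dfa . foldl (transition dfa) (start dfa)

    -- Example: strings over {'a','b'} containing an even number of 'a's.
    evenAs :: DFA Bool Char
    evenAs = DFA
      { start      = True  -- True means "seen an even number of 'a's so far"
      , accepting  = id
      , transition = \e c -> if c == 'a' then not e else e
      }

    main :: IO ()
    main = print (accepts evenAs "abba", accepts evenAs "ab")  -- (True,False)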
Bring us your code! We'll be looking at how functional programming techniques can help improve the code you work with every day. Everything from shell scripts to Haskell is fair game, and everyone is welcome - whether you're a functional programming ninja or you don't know exactly what functional programming is!
We'll also have some examples ready to get things moving; fork and PR https://github.com/defshef/35-refactor-my-code to add your favourite!
The k-nearest neighbours classifier is a classic machine learning algorithm. We'll use functional languages to implement a classifier, then use it with real-world public data sets to solve classification problems.
A suggested worksheet is available at https://github.com/defshef/dojo-knn
Once you've implemented and explored the algorithm, there are many details you can dive deeper on. How does your algorithm perform on larger data sets, and can you speed it up? How about plugging in different distance functions? If you're already an ML expert, you can look at a more advanced algorithm and compare performance with the kNN implementations.
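For a feel of how small the core algorithm can be, here's a minimal, hedged k-NN sketch in Haskell; the Euclidean distance, the toy data set and all the names are made up for illustration, and it assumes k > 0 and a non-empty training set. Swapping in a different distance function, or replacing the full sort with something smarter, are obvious places to start experimenting.

    import Data.List (group, maximumBy, sort, sortOn)
    import Data.Ord (comparing)

    type Point = [Double]

    euclidean :: Point -> Point -> Double
    euclidean xs ys = sqrt . sum $ zipWith (\x y -> (x - y) ^ (2 :: Int)) xs ys

    -- Classify a query point by majority vote among the k nearest training
    -- examples. Assumes k > 0 and a non-empty training set.
    classify :: Ord label => Int -> [(Point, label)] -> Point -> label
    classify k training query =
          mostCommon
        . map snd
        . take k
        $ sortOn (euclidean query . fst) training
      where
        mostCommon = head . maximumBy (comparing length) . group . sort

    main :: IO ()
    main = do
      let training = [([1,1],"red"), ([1,2],"red"), ([5,5],"blue"), ([6,5],"blue")]
      print (classify 3 training [1.5, 1.5])  -- expected: "red"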