Posts from Rogue Scholar

Published in Math ∩ Programming
Author Jeremy Kun

I added Twitter syndication, and because I have nothing to test it with I’ll share some random life updates. My daughter was born recently, which means I’m on paternity leave for a few months. Hopefully in the liminal hours of sleep training, I’ll have some time to work on my book. Or at least catch up on reading. I finally published my grandmother’s autobiography.

Published in Math ∩ Programming
Author Jeremy Kun

Steven Clontz informed me of an effort he’s involved in called code4math. It’s described as a professional organization for the advancement of mathematical research through building non-research software infrastructure. By that he means, for example, writing software packages like Macaulay2, or building databases of mathematical objects that other researchers can use in their own work.

Published in Math ∩ Programming
Author Jeremy Kun

POSSE stands for Publish (on your) Own Site, Syndicate Elsewhere. I first heard about it from Cory Doctorow. I’m experimenting with automation to convert posts tagged shortform into Mastodon threads (I’m mathstodon.xyz/@j2kun). I’m using Hugo as a static site generator, with the source in a (private) GitHub repository, and Netlify for deployments.
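As a rough sketch of what that automation amounts to (not the author’s actual pipeline; the environment variable name and the naive 500-character splitting are assumptions), a shortform post can be turned into a Mastodon thread by chaining calls to the statuses API:

```python
# Rough sketch of the syndication step, assuming an access token in a
# hypothetical MASTODON_TOKEN environment variable and a naive
# 500-character split; a real setup would track which posts were already
# syndicated and count characters the way Mastodon does.
import os
import textwrap

import requests

INSTANCE = "https://mathstodon.xyz"
TOKEN = os.environ["MASTODON_TOKEN"]


def post_status(text: str, in_reply_to: str | None = None) -> str:
    """Publish one status via POST /api/v1/statuses and return its id."""
    data = {"status": text}
    if in_reply_to is not None:
        data["in_reply_to_id"] = in_reply_to
    resp = requests.post(
        f"{INSTANCE}/api/v1/statuses",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data=data,
    )
    resp.raise_for_status()
    return resp.json()["id"]


def syndicate_as_thread(post_body: str) -> None:
    """Split a shortform post into chunks and post each as a reply to the last."""
    previous_id = None
    for chunk in textwrap.wrap(post_body, width=500, replace_whitespace=False):
        previous_id = post_status(chunk, in_reply_to=previous_id)
```

The threading itself is just the in_reply_to_id parameter: each chunk replies to the previously posted status.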

Published in Math ∩ Programming
Author Jeremy Kun

I’ve been learning recently about how to approximate functions by low-degree polynomials. This is useful in fully homomorphic encryption (FHE) in the context of “arithmetic FHE” (see my FHE overview article), where the computational model makes low-degree polynomials cheap to evaluate and non-polynomial functions expensive or impossible. In browsing the state of the art I came across two interesting things.
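For intuition (this example is mine, not from the post), the basic move is to replace a non-polynomial function with a polynomial of fixed small degree that is accurate enough on the interval you care about, for example via a Chebyshev least-squares fit:

```python
# A toy illustration: approximate a non-polynomial function by a
# low-degree polynomial via a Chebyshev least-squares fit. The choice
# of function (tanh), degree, and interval is arbitrary.
import numpy as np
from numpy.polynomial import Chebyshev


def low_degree_approx(f, degree, interval=(-1.0, 1.0), samples=256):
    """Fit a Chebyshev polynomial of the given degree to f on the interval."""
    x = np.linspace(interval[0], interval[1], samples)
    return Chebyshev.fit(x, f(x), deg=degree, domain=list(interval))


if __name__ == "__main__":
    approx = low_degree_approx(np.tanh, degree=7)
    xs = np.linspace(-1.0, 1.0, 1000)
    print("max error of degree-7 fit to tanh on [-1, 1]:",
          np.max(np.abs(approx(xs) - np.tanh(xs))))
```

The degree is the knob that matters in arithmetic FHE, since evaluation cost grows with it.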

Published in Math ∩ Programming
Author Jeremy Kun

About two years ago, I switched teams at Google to focus on fully homomorphic encryption (abbreviated FHE, or sometimes HE). Since then I’ve gotten to work on a lot of interesting projects, learning along the way about post-quantum cryptography, compiler design, and the ins and outs of fully homomorphic encryption.

Published in Math ∩ Programming
Author Jeremy Kun

It’s April Cools! Last year I wrote about friendship bracelets, and the year before about cocktails. This year it’s parenting. Parenting articles are a dime a dozen and always bury the lede behind a long story. I’ll skip that. The first section, “How to think about your child and your role as a parent,” lays out framing devices; concrete things to do to work toward them are in the next section.

Published in Math ∩ Programming
Author Jeremy Kun

There’s a family of tabletop games, each based directly on a nontrivial mathematics problem. As a casual and fun way to inaugurate my new blog (migrated from WordPress to Hugo, after my work on getting better LaTeX math mode support in Hugo), I thought I’d write a short listicle about them, both so that I have a place to add more as I find them and to give the shortest canonical description of each game’s associated math problem.

Published in Math ∩ Programming
Author Jeremy Kun

In this article we’ll implement a global optimization pass and show how to use the dataflow analysis framework to verify the results of our optimization. The code for this article is in this pull request, and as usual the commits are organized to be read in order. The demonstration is built around a “noisy arithmetic” problem, a simplified model of computation relevant to the HEIR project.

Published in Math ∩ Programming
Author Jeremy Kun

In the last article we lowered our custom poly dialect to standard MLIR dialects. In this article we’ll continue lowering it to LLVM IR, exporting it out of MLIR to LLVM, and then compiling to x86 machine code. The code for this article is in this pull request, and as usual the commits are organized to be read in order. To define the lowering pipeline, the first step in lowering to machine code is to lower to an “exit dialect.”

Published in Math ∩ Programming
Author Jeremy Kun

In previous articles we defined a dialect, and wrote various passes to optimize and canonicalize a program using that dialect. However, one of the main tenets of MLIR is “incremental lowering,” the idea that there are lots of levels of IR granularity, and you incrementally lower different parts of the IR, only discarding information when it’s no longer useful for optimizations.