Planet GNOME

https://planet.gnome.org/

Jussi Pakkanen: What's cooking with Pystd, the experimental C++ standard library?

Sat, 14/02/2026 - 6:54 PM

Pystd is an experiment in what a C++ standard library without any backwards compatibility requirements would look like. Its design goals are, in order of decreasing priority:

  • Fast build times
  • Simplicity of implementation
  • Good performance
It also has some design anti-goals, things it deliberately does not pursue:

  • Compatibility with the ISO C++ standard library
  • Support for weird corner cases such as linked lists or types that cannot be noexcept-moved
  • Reinventing things that are already in the C standard library (though it may provide a nicer interface on top of them)
Current status

There is a bunch of stuff implemented, like vector, several string types, hashmap, a B-tree based ordered map, regular expressions, unix path manipulation operations and so on. The latest addition has been sort algorithms, which include merge sort, heap sort and introsort.

None of these is "production quality". They will almost certainly have bugs. Don't rely on them for "real work". 

The actual library consists of approximately 4800 lines of headers and 4700 lines of source. Building the library and all test code on a Raspberry Pi using a single core takes 13 seconds. With 30 process invocations this means approximately 0.4 seconds per compilation.

For real-world testing we really have only one data point, but in it the build time was reduced by three quarters, the binary became smaller and the end result ran faster.

Portability

The code has been tested on Linux x86_64 and aarch64, as well as on macOS. It currently does not work with Visual Studio, which has not yet implemented support for pack indexing.

Why should you consider using it?

Back in the 90s and 00s (I think) it was fashionable to write your own C++ standard library implementation. Eventually they all died and people moved to the one that comes with their compiler. Which is totally reasonable. So why would you now switch to something else?

For existing C++ applications you probably don't want to. The amount of work needed for a port is too much to be justified in most cases.

For green field projects things are more interesting. Maybe you want to try something new just for the fun of it? That is the main reason Pystd even exists: I wanted to try implementing the core building blocks of a standard library from scratch.

Maybe you want to provide "Go style" binaries that build fast and have no external deps? The size overhead of Pystd is only a few hundred k and the executables it yields only depend on libc (unless you use regexes, in which case they also depend on libpcre, but you can static link it if you prefer).

Resource-constrained or embedded systems might also be a good fit. Libstdc++ takes a few megabytes. Pystd does require malloc, though (more specifically, aligned allocation), so for the smallest embedded targets you'd need to use something like the freestanding library. As an additional feature, Pystd permits you to disable parts of the library that are not used (currently only regexes, but this could be extended to things like threading and the file system).

Compiler implementers might choose to test their performance with an unusual code base. For example GCC compiles most Pystd files in a flash but for some reason the B-tree implementation takes several seconds to build. I don't really know why because it does not do any heavy duty metaprogramming or such.

It might also be usable in teaching as a fairly small implementation of the core algorithms used today. Assuming anyone does education any more as opposed to relying on LLMs for everything.


Cassidy James Blaede: How I Designed My Cantina Birthday Party

Sat, 14/02/2026 - 1:00 AM

Ever since my partner and I bought a house several years ago, I’ve wanted to throw a themed Star Wars party here. We’ve talked about doing a summer movie showing thing, we’ve talked about doing a Star Wars TV show marathon, and we’ve done a few birthday parties—but never the full-on themed party that I was dreaming up. Until this year!

For some reason, a combination of rearranging some of our furniture, the state of my smart home, my enjoyment of Star Wars: Outlaws, and my newfound work/life balance meant that this was the year I finally committed to doing the party.

Pitch

For the past few years I’ve thrown a two-part birthday party: we start out at a nearby bar or restaurant, and then head to the house for more drinks and games. I like this format as it gives folks a natural “out” if they don’t want to commit to the entire evening: they can just join the beginning and then head out, or they can just meet up at our house. I was planning to do the same this year, but decided: let’s go all-in at the house so we have more time for more fun. I knew I wanted:

  1. Trivia! I organized a fun little Star Wars trivia game for my birthday last year and really enjoyed how nerdy my friends were with it, so this year I wanted to do something similar. My good friend Dagan volunteered to put together a fresh trivia game, which was incredible.

  2. Sabacc. The Star Wars equivalent to poker, featured heavily in the Star Wars: Outlaws game as well as in Star Wars: Rebels, Solo: A Star Wars Story, and the Disney Galactic Starcruiser (though it’s Kessel sabacc vs. traditional sabacc vs. Corellian spike vs. Coruscant shift respectively… but I digress). I got a Kessel sabacc set for Christmas and have wanted to play it with a group of friends ever since.

  3. Themed drinks. Revnog is mentioned in Star Wars media including Andor as some sort of liquor, and spotchka is featured in the New Republic era shows like The Mandalorian and The Book of Boba Fett. There isn’t really any detail as to what each tastes like, but I knew I wanted to make some batch cocktails inspired by these in-universe drinks.

  4. Immersive environment. This meant smart lights, music, and some other aesthetic touches. Luckily over the years I’ve upgraded my smart home to feature nearly all locally-controllable RGB smart bulbs and fixtures; while during the day they simply shift from warm white to daylight and back, it means I can do a lot with them for special occasions. I also have networked speakers throughout the house, and a 3D printer.

About a month before the party, I got to work.

Aesthetic

For the party to feel immersive, I knew getting the aesthetic right was paramount. I also knew I wanted to send out themed invites to set the tone, so I had to start thinking about the whole thing early.

Star Wars: Outlaws title screen

Star Wars: Outlaws journal UI

Since I’d been playing Star Wars: Outlaws, that was my immediate inspiration. I also follow the legendary Louie Mantia on Mastodon, and had bought some of his Star Wars fonts from The Crown Type Company, so I knew at least partially how I was going to get there.

Initial invite graphic (address censored)

For the invite, I went with a cyan-on-black color scheme. This is featured heavily in Star Wars: Outlaws but is also an iconic Star Wars look (“A long time ago…”, movie end credits, Clone Wars title cards, etc.). I chose the Spectre font as it’s very readable but also very Star Wars. To give it some more texture (and as an easter egg for the nerds), I used Womprat Aurebesh offset and dimmed behind the heading. The whole thing was a pretty quick design, but it did its job and set the tone.

Website

I spent a bit more time iterating on the website, and it’s a more familiar domain for me than more static designs like the invite. I especially like how the offset Aurebesh turned out on the headings, as it feels very in-universe to me. I also played with a bit of texture on the website to give it that lo-fi/imperfect tech vibe that Star Wars so often embraces.

For the longer-form body text, I wanted something even more readable than the more display-oriented fonts I’d used, so I turned to a good friend: Inter (also used on this site!). It doesn’t really look like Inter though… because I used almost every stylistic alternate that the font offers—explicitly to make it feel legible but also… kinda funky. I think it worked out well. Specifically, notice the lower-case “a”, “f”, “L”, “t”, and “u” shapes, plus the more rounded punctuation.

Screenshot of my website

I think more people should use subdomains for things like this! It’s become a meme at this point that people buy domains for projects they never get around to, but I always have to remind people: subdomains are free. Focus on making the thing, put it up on a subdomain, and then if you ever spin it out into its own successful thing, then you can buy a flashy bare domain for it!

Since I already owned blaede.family where I host extended family wishlists, recipes, and a Mastodon server, I resisted the urge to purchase yet another domain and instead went with a subdomain. cantina.blaede.family doesn’t quite stay totally immersive, but it worked well enough—especially for a presumably short-lived project like this.

Environment

Once I had the invite nailed down, I started working on what the actual physical environment would look like. I watched the bar/cantina scenes from A New Hope and Attack of the Clones, scoured concept art, and of course played more Outlaws. The main thing I came away thinking about was lighting!

Lighting

The actual cantinas are often not all that otherworldly, but lighting plays a huge role, both in color and in overall dimness, with a lot of (sometimes colorful) accent lighting.

So, I got to work on setting up a lighting scene in Home Assistant. At first I was using the same color scheme everywhere, but I quickly found that distinct color schemes for different areas would feel more fun and interesting.

Lounge area

For the main lounge-type area, I went with dim orange lighting and just a couple of green accent lamps. This reminds me of Jabba’s palace and Boba Fett, and just felt… right. It’s sort of organic but would be a somewhat strange color scheme outside of Star Wars. It’s also the first impression people will get when coming into the house, so I wanted it to feel the most recognizably Star Wars-y.

Kitchen area

Next, I focused on the kitchen, where people would gather for drinks and snacks. We have white under-cabinet lighting which I wanted to keep for function (it’s nice to see what color your food actually is…), but I went with a bluish-purple (almost ultraviolet) and pink.

Coruscant bar from Attack of the Clones

While this is very different from a cantina on Tatooine, it reminded me of the Coruscant bar we see in Attack of the Clones as well as some of the environments in The Clone Wars and Outlaws. At one point I was going to attempt to make a glowing cocktail that would luminesce under black light—I ditched that, but the lighting stayed.

Dining room sabacc table

One of the more important areas was, of course, the sabacc table (the dining room), which is adjacent to the kitchen. I had to balance ensuring the cards and chips are visible with that dim, dingy, underworld vibe. I settled on actually adding a couple of warm white accent lights (3D printed!) for visibility, then using the ceiling fan lights as a sabacc round counter (with a Zigbee button as the dealer token).

3D printed accent light

Lastly, I picked a few other colors for adjacent rooms: a more vivid purple for the bathroom, and red plus a rainbow LED strip for my office (where I set up split-screen Star Wars: Battlefront II on a PS2).

Office area

I was pretty happy with the lighting at this point, but then I re-watched the Mos Eisley scenes and noticed some fairly simple accent lights: plain warm white cylinders on the tables.

I threw together a simple print for my 3D printer and added some battery-powered puck lights underneath: perfection.

First test of my cylinder lights

Music

With my networked speakers, I knew I wanted some in-universe cantina music—but I also knew the cantina song would get real old, real fast. Since I’d been playing Outlaws as well as a fan-made Holocard Cantina sabacc app, I knew there was a decent amount of in-universe music out there; luckily it’s actually all on YouTube Music.

I made a looooong playlist including a bunch of that music plus some from Pyloon’s Saloon in Jedi: Survivor, Oga’s Cantina at Disney’s Galaxy’s Edge, and a select few tracks from other Star Wars media (Niamos!).

Sabacc

A big part of the party was sabacc; we ended up playing several games and really getting into it. To complement the cards and dice (from Hyperspace Props), I 3D printed chips and tokens that we used for the games.

3D printed sabacc tokens and chips

We started out simple with just the basic rules and no tokens, but after a couple of games, we introduced some simple tokens to make the game more interesting.

Playing sabacc

I had a blast playing sabacc with my friends and by the end of the night we all agreed: we need to play this more frequently than just once a year for my birthday!

Drinks

I’m a fan of batch cocktails for parties, because it means less time tending a bar and more time enjoying company—plus it gives you a nice opportunity for a themed drink or two that you can prepare ahead of time. I decided to make two batch cocktails: green revnog and spotchka.

Bottles of spotchka and revnog

Revnog is shown a few times in Andor, but it’s hard to tell what it looks like—one time it appears to be blue, but it’s also lit by the bar itself. When it comes to taste, the StarWars.com Databank just says it “comes in a variety of flavors.” However, one character mentions “green revnog” as being her favorite, so I decided to run with that so I could make something featuring objectively the best fruit in the galaxy: pear (if you know, you know).

My take on green revnog

After a lot of experimenting, I settled on a spiced pear gin drink that I think is a nice balance between sweet, spiced, and boozy. The simple batch recipe came out to: 4 parts gin, 1 part St. George’s Spiced Pear Liqueur, 1 part pear juice, and 1 part lemon juice. It can be served directly on ice, or cut with sparkling water to tame it a bit.

Spotchka doesn’t get its own StarWars.com Databank entry, but is mentioned in a couple of entries about locations from an arc of The Mandalorian. All that can be gleaned is that it’s apparently glowing and blue (Star Wars sure loves its blue drinks!), and made from “krill” which in Star Wars is shrimp-like.

My take on spotchka

I knew blue curaçao would be critical for a blue cocktail, and after a bit of asking around for inspiration, I decided coconut cream would give it a nice opacity and lightness. The obvious other ingredients for me, then, were rum and pineapple juice. I wanted it to taste a little more complex than just a Malibu pineapple, so I raided my liquor supply until I found my “secret” ingredient: grapefruit vodka. Just a tiny bit of that made it taste really unique and way more interesting! The final ratios for the batch are: 4 parts coconut rum, 2 parts white rum, 2 parts blue curaçao, 1 part grapefruit vodka, 2 parts pineapple juice, 1 part coconut cream. Similar to the revnog, it can be served directly on ice or cut with sparkling water for a less boozy drink.

Summary

Overall I had a blast hanging out, drinking cocktails, playing sabacc, and nerding out with my friends. The immersive-but-not-overbearing environment felt right; just one friend (the trivia master!) dressed up, which was perfect, as I explicitly told everyone that costumes were not expected but left it open in case anyone wanted to dress up. The trivia, drinks, and sabacc all went over well, and a handful of us hung around until after 2 AM enjoying each other’s company. That’s a win in my book. :)

Martin Pitt: Revisiting Google Cloud Performance for KVM-based CI

Fri, 13/02/2026 - 1:00 AM

Summary from 2022

Back then, I evaluated Google Cloud Platform for running Cockpit’s integration tests. Nested virtualization on GCE was way too slow, crashy, and unreliable for our workload. Tests that ran in 35-45 minutes on bare metal (my laptop) took over 2 hours with 15 failures, timeouts, and crashes. The nested KVM simply wasn’t performant enough.

On today’s Day of Learning, I gave this another shot, and was pleasantly surprised.

Olav Vitters: GUADEC 2026 accommodation

Thu, 12/02/2026 - 9:51 AM

One of the things that I appreciate at a GUADEC (if available) is a common accommodation. Loads of attendees appreciated the shared accommodation in Vilanova i la Geltrú, Spain (GUADEC 2006). For GUADEC 2026, Deepesha announced one recommended accommodation, a student residence. GUADEC 2026 is in the same place as GUADEC 2012, meaning A Coruña, Spain. I didn’t go to the 2012 one, though I heard it also had a shared accommodation. For those wondering where to stay, I suggest the recommended one.

Andy Wingo: free trade and the left

Wed, 11/02/2026 - 11:20 PM

I came of age an age ago, in the end of history: the decade of NAFTA, the WTO, the battle of Seattle. My people hated it all, interpreting the global distribution of production as a race to the bottom leaving post-industrial craters in its wake. Our slogans were to buy local, to the extent that participation in capitalism was necessary; to support local businesses, local products, local communities. What were trade barriers to us? One should not need goods from far away in the first place; autarky is the bestarky.

Now, though, the vibe has shifted. If you told me then that most leftists would be aghast at the cancelling of NAFTA and subsequently relieved when it was re-instated as the USMCA, I think I would have used that most 90s of insults: sellouts. It has been a perplexing decade in many ways.

The last shoe to drop for me was when I realized that, given the seriousness of the climate crisis, it is now a moral imperative everywhere to import as many solar panels as cheaply as possible, from wherever they might be made. Solar panels will liberate us from fossil fuel consumption. China makes them cheapest. Neither geopolitical (“please believe China is our enemy”) nor antiglobalist (“home-grown inverters have better vibes”) arguments have any purchase on this reality. In this context, any trade barrier on Chinese solar panels means more misery for our grandchildren, as they suffer under global warming that we caused. Every additional half-kilowatt panel we install is that much less dinosaur that other people will choose to burn.

But if trade can be progressive, then when is it so, and when not? I know that the market faithful among my readers will see the question as nonsense, but that is not my home church. We on the left pride ourselves in our critical capacities, in our ability to challenge common understandings. However, I wonder about the extent to which my part of the left has calcified and become brittle, recycling old thoughts that we thunk about new situations that rhyme with things we remember, in which we spare ourselves the effort of engagement out of confidence in our past conclusions, but in doing so miss the mark. What is the real leftist take on the EU-Mercosur agreement, anyway?

This question has consumed me for some weeks. To help me think through it, I have enlisted four sources: Quinn Slobodian’s Globalists (2018), an intellectual history of neoliberalism, from the first world war to the 1990s; Marc-William Palen’s Pax Economica (2024), a history of arguments for free trade “from the left”, from the 1840s to the beginnings of Slobodian’s book; Starhawk’s Webs of power (2002), a moment-in-time, first-person-plural account of the antiglobalization movement between Seattle and 9/11; and finally Florence Reece’s Which side are you on? (1931), as recorded by the Weavers, a haunting refrain that dogs me all through this investigation, reminding me that the question is visceral and not abstract.

stave 1: mechanism

Before diving in though, I should admit that I don’t actually understand the basics, and it is not for a lack of trying.

Let’s work through some thought experiments together. Let’s imagine you are in France and want to buy nails. To produce nails, a capitalist has to purchase machines and buildings and steel, has to pay the workers to operate the machines, then finally the nails have to get boxed and shipped to, you know, me. I could buy French-made nails, apparently there is one clouterie left. Even assuming that Creil is a hotbed of industrial development and that its network of suppliers and machine operators is such that they can produce nails very efficiently, the wage in France is higher than the wage in, say, China, or Vietnam, or Turkey. The cost of the labor that goes into each nail is more.

So if I have 1000€ that I would like to devote to making a thing in France, I might prefer to obtain my nails from Turkey for 50€ rather than from France for 100€, because it leaves me more left over in which I can do my thing.

Perhaps France sees this as a travesty; in response, France imposes a tariff of 100% on imported nails. In that way, it might be the same to me as a consumer whether the nails were from Turkey or France. But this imposes a cost on me; I have less left over with which to do my thing. In exchange, the French clouterie gets to keep on going, along with the workers that work there, their families and the community around the factory, the suppliers of the factory, and so on.

But do I actually care that my nails are made in France? Should the country care that it has a capacity to make nails? When I was 25, I thought so, that it would be better if everything were made locally. Perhaps that was a distortion from having grown up in the US, in which we consider goods originating between either ocean as “ours”; smaller polities cannot afford that conceit. There may or may not be a reason that France might want nails to be made locally; it is a political question, but one that is not without tradeoffs. If we always choose the local option, we will ultimately do worse overall than a country that is happy to import; in the limit case, some goods like coffee would be unobtainable, with a price floor established by the cost to artificially re-create in France the mountain climate of Perú.

Let us accept, however, the argument that free trade improves overall efficiency, yielding productivity gains that ultimately get spread around society, making everyone’s lives better. We on the left are used to being more critical than constructive, but we do ourselves a disservice if we let our distaste of capitalism prevent us from appreciating how it works. Reading Marx’s Capital, for example, one can’t help but appreciate the awe which he has towards the dynamism of capitalism, apart from analysis and critique. In that spirit of dynamic awe, therefore, let us assume that free trade leads to higher productivity. How does it do this?

Here, I think, we must recognize a kind of ruthless cruelty. The one clouterie survives because of local demand; it can’t compete on the global market. It will fold eventually. Its workers will be laid off, its factories vacated, its town... well Creil is on the outskirts of Paris, it will live on, but less as a center in and of itself.

The nail factory will try to hang on for a while, though. It will start by putting downward pressure on wages, and extract other concessions from workers, in the name of economic necessity. Otherwise, the boss says, we move the factory to Tunisia. This operates also on larger levels, with the chamber of commerce (MEDEF) arguing for regulatory “simplifications”, new kinds of contracts with fewer protections, and so on. To the extent that a factory’s product is a commodity – whether the nails are Malaysian or French doesn’t matter – to that extent, free trade is indeed a race to the bottom, allowing mobile capital to play different jurisdictions off each other. The same goes for quality standards, environmental standards, and similar ways in which countries might choose to internalize what are elsewhere externalities.

I am a sentimental dude and have always found this sort of situation to be quite sad. Places that are on the up and up have a yuppie, rootless vibe, while those that are in decline have roots but no branches. Economics washes its hands of vibes, though; if it’s not measurable in euros, it doesn’t exist. What is the value of 19th-century India’s textile industry against the low price of Manchester-milled cloth? Can we put a number on it? We did, of course; zero is a number after all.

Finally, before I close today’s missive, a note on distribution. Most macro-economics is indifferent as regards distribution; gross domestic product is assessed relative to a country, not its parts. But if nails are producible for 50€ in Turkey, shipping included, that doesn’t mean they get sold at that price in France; if they sell for 70€, the capitalist pockets the change. That’s the base of surplus value: the owner pays the input costs, and the input cost of a worker is what the worker needs to make a living (rent, food, etc), but that cost is less than the value of what the worker produces. That productivity goes up doesn’t necessarily mean that the gains make it to the producers; that requires a finer analysis.

to be continued

For me, the fundamental economic argument for free trade is not unambiguously positive. Yes, lower trade barriers should leave us all with more left over with which we can do cool things. But the mechanism is cruel, the benefits accrue unequally, and there is value in producing communities that is not captured in prices.

In my next missive, we go back to the 19th century to see what Marx and Cobden had to say about the topic. Until then, happy trading!

Asman Malika: Career Opportunities: What This Internship Is Teaching Me About the Future

Mon, 09/02/2026 - 3:04 PM

 Before Outreachy, when I thought about career opportunities, I mostly thought about job openings, applications, and interviews. Opportunities felt like something you wait for, or hope to be selected for.

This internship has changed how I see that completely.

I’m learning that opportunities are often created through contribution, visibility, and community, not just applications.

Opportunities Look Different in Open Source

Working with GNOME has shown me that contributing to open source is not just about writing code, it’s about building a public track record. Every merge request, every review cycle, every improvement becomes part of a visible body of work.

Through my work on Papers (implementing manual signature features, fixing issues, contributing to the Poppler codebase, and now working on digital signatures), I’m not just completing tasks. I’m building real-world experience in a production codebase used by actual users.

That kind of experience creates opportunities that don’t always show up on job boards:

  • Collaborating with experienced maintainers
  • Learning large-project workflows
  • Becoming known within a technical community
  • Developing credibility through consistent contributions
Skills That Expand My Career Options

This internship is also expanding what I feel qualified to do. I’m gaining experience with:

  • Building new features
  • Large, existing codebases
  • Code review and iteration cycles
  • Debugging build failures and integration issues
  • Writing clearer documentation and commit messages
  • Communicating technical progress

These are skills that apply across many roles, not just one job title. They open doors to remote collaboration, open-source roles, and product-focused engineering work.

Career Is Bigger Than Employment

One mindset shift for me is that career is no longer just about “getting hired.” It’s also about impact and direction.

I now think more about:

  • What kind of software I want to help build
  • What communities I want to contribute to
  • How accessible and user-focused tools can be
  • How I can support future newcomers the way my GNOME mentors supported me

Open source makes career feel less like a ladder and more like a network.

Creating Opportunities for Others

Coming from a non-traditional path into tech, I’m especially aware of how powerful access and guidance can be. Programs like Outreachy don’t just create opportunities for individuals, they multiply opportunities through community.

As I grow, I want to contribute not only through code, but also through sharing knowledge, documenting processes, and encouraging others who feel unsure about entering open source.

Looking Ahead

I don’t have every step mapped out yet. But I now have something better: direction and momentum.

I want to continue contributing to open source, deepen my technical skills, and work on tools that people actually use. Outreachy and GNOME have shown me that opportunities often come from showing up consistently and contributing thoughtfully.

That’s the path I plan to keep following.

Andy Wingo: six thoughts on generating c

Mon, 09/02/2026 - 2:47 PM

So I work in compilers, which means that I write programs that translate programs to programs. Sometimes you will want to target a language at a higher level than just, like, assembler, and oftentimes C is that language. Generating C is less fraught than writing C by hand, as the generator can often avoid the undefined-behavior pitfalls that one has to be so careful about when writing C by hand. Still, I have found some patterns that help me get good results.

Today’s note is a quick summary of things that work for me. I won’t be so vain as to call them “best practices”, but they are my practices, and you can have them too if you like.

static inline functions enable data abstraction

When I learned C, in the early days of GStreamer (oh bless its heart it still has the same web page!), we used lots of preprocessor macros. Mostly we got the message over time that many macro uses should have been inline functions; macros are for token-pasting and generating names, not for data access or other implementation.

But what I did not appreciate until much later was that always-inline functions remove any possible performance penalty for data abstractions. For example, in Wastrel, I can describe a bounded range of WebAssembly memory via a memory struct, and an access to that memory in another struct:

struct memory { uintptr_t base; uint64_t size; };
struct access { uint32_t addr; uint32_t len; };

And then if I want a writable pointer to that memory, I can do so:

#define static_inline \
  static inline __attribute__((always_inline))

static_inline void* write_ptr(struct memory m, struct access a) {
  BOUNDS_CHECK(m, a);
  char *base = __builtin_assume_aligned((char *) m.base, 4096);
  return (void *) (base + a.addr);
}

(Wastrel usually omits any code for BOUNDS_CHECK, and just relies on memory being mapped into a PROT_NONE region of an appropriate size. We use a macro there because if the bounds check fails and kills the process, it’s nice to be able to use __FILE__ and __LINE__.)

Regardless of whether explicit bounds checks are enabled, the static_inline attribute ensures that the abstraction cost is entirely burned away; and in the case where bounds checks are elided, we don’t need the size of the memory or the len of the access, so they won’t be allocated at all.

If write_ptr wasn’t static_inline, I would be a little worried that somewhere one of these struct values would get passed through memory. This is mostly a concern with functions that return structs by value; whereas in e.g. AArch64, returning a struct memory would use the same registers that a call to void (*)(struct memory) would use for the argument, the SYS-V x64 ABI only allocates two general-purpose registers to be used for return values. I would mostly prefer to not think about this flavor of bottleneck, and that is what static inline functions do for me.

avoid implicit integer conversions

C has an odd set of default integer conversions, for example promoting uint8_t to signed int, and also has weird boundary conditions for signed integers. When generating C, we should probably sidestep these rules and instead be explicit: define static inline u8_to_u32, s16_to_s32, etc conversion functions, and turn on -Wconversion.

Using static inline cast functions also allows the generated code to assert that operands are of a particular type. Ideally, you end up in a situation where all casts are in your helper functions, and no cast is in generated code.
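As a concrete illustration, here is a minimal sketch of such helpers; the particular names and the assert-based checking policy are my assumptions rather than code from any specific project:

#include <assert.h>
#include <stdint.h>

/* Widening conversions are always value-preserving; making them explicit
   keeps -Wconversion quiet and keeps the intent visible in generated code. */
static inline uint32_t u8_to_u32(uint8_t x) { return (uint32_t) x; }
static inline int32_t s16_to_s32(int16_t x) { return (int32_t) x; }

/* Narrowing conversions can assert that the value actually fits. */
static inline uint8_t u32_to_u8(uint32_t x) {
  assert(x <= UINT8_MAX);
  return (uint8_t) x;
}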

wrap raw pointers and integers with intent

Whippet is a garbage collector written in C. A garbage collector cuts across all data abstractions: objects are sometimes viewed as absolute addresses, or ranges in a paged space, or offsets from the beginning of an aligned region, and so on. If you represent all of these concepts with size_t or uintptr_t or whatever, you’re going to have a bad time. So Whippet has struct gc_ref, struct gc_edge, and the like: single-member structs whose purpose it is to avoid confusion by partitioning sets of applicable operations. A gc_edge_address call will never apply to a struct gc_ref, and so on for other types and operations.

This is a great pattern for hand-written code, but it’s particularly powerful for compilers: you will often end up compiling a term of a known type or kind and you would like to avoid mistakes in the residualized C.
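To make the pattern concrete, here is a minimal sketch of single-member wrappers in the spirit of Whippet’s types; the field and accessor names are illustrative assumptions, not Whippet’s actual definitions:

#include <stdint.h>

/* Illustrative sketch, not Whippet's actual code: a reference and an edge
   (a location holding a reference) as distinct single-member structs. */
struct gc_ref { uintptr_t value; };
struct gc_edge { struct gc_ref *loc; };

static inline struct gc_ref gc_ref_from_addr(uintptr_t addr) {
  return (struct gc_ref){ addr };
}
static inline uintptr_t gc_ref_addr(struct gc_ref ref) {
  return ref.value;
}
/* Edge operations only accept edges, so mixing up the two kinds of
   value is a compile-time error rather than a silent bug. */
static inline struct gc_ref gc_edge_load(struct gc_edge edge) {
  return *edge.loc;
}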

For example, when compiling WebAssembly, consider struct.set’s operational semantics: the textual rendering states, “Assert: Due to validation, val is some ref.struct structaddr.” Wouldn’t it be nice if this assertion could translate to C? Well in this case it can: with single-inheritance subtyping (as WebAssembly has), you can make a forest of pointer subtypes:

typedef struct anyref { uintptr_t value; } anyref;
typedef struct eqref { anyref p; } eqref;
typedef struct i31ref { eqref p; } i31ref;
typedef struct arrayref { eqref p; } arrayref;
typedef struct structref { eqref p; } structref;

So for a (type $type_0 (struct (mut f64))), I might generate:

typedef struct type_0ref { structref p; } type_0ref;

Then if I generate a field setter for $type_0, I make it take a type_0ref:

static inline void type_0_set_field_0(type_0ref obj, double val) { ... }

In this way the types carry through from source to target language. There is a similar type forest for the actual object representations:

typedef struct wasm_any { uintptr_t type_tag; } wasm_any;
typedef struct wasm_struct { wasm_any p; } wasm_struct;
typedef struct type_0 { wasm_struct p; double field_0; } type_0;
...

And we generate little cast routines to go back and forth between type_0ref and type_0* as needed. There is no overhead because all routines are static inline, and we get pointer subtyping for free: if a struct.set $type_0 0 instruction is passed a subtype of $type_0, the compiler can generate an upcast that type-checks.
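As a hedged sketch of what those generated cast routines might look like, building on the typedefs above (the helper names are assumptions, not the actual generated code):

/* Sketch of generated casts for $type_0; helper names are assumptions.
   Unwrapping walks the single-member structs down to the raw address. */
static inline type_0* type_0ref_deref(type_0ref r) {
  return (type_0*) r.p.p.p.value;
}
static inline type_0ref type_0ref_from_ptr(type_0* obj) {
  return (type_0ref){ { { { (uintptr_t) obj } } } };
}
/* Upcasts just peel off one wrapper layer, so they always type-check. */
static inline structref type_0ref_upcast(type_0ref r) {
  return r.p;
}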

fear not memcpy

In WebAssembly, accesses to linear memory are not necessarily aligned, so we can’t just cast an address to (say) int32_t* and dereference. Instead we memcpy(&i32, addr, sizeof(int32_t)), and trust the compiler to just emit an unaligned load if it can (and it can). No need for more words here!
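For example, a minimal unaligned-load helper might look like this (the function name is illustrative); with optimization enabled, GCC and Clang compile the memcpy down to a single load:

#include <stdint.h>
#include <string.h>

/* Illustrative helper: read a possibly-unaligned 32-bit value from
   linear memory by copying it into a properly typed local. */
static inline int32_t load_i32_unaligned(const void *addr) {
  int32_t i32;
  memcpy(&i32, addr, sizeof i32);
  return i32;
}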

for ABI and tail calls, perform manual register allocation

So, GCC finally has __attribute__((musttail)): praise be. However, when compiling WebAssembly, it could be that you end up compiling a function with, like 30 arguments, or 30 return values; I don’t trust a C compiler to reliably shuffle between different stack argument needs at tail calls to or from such a function. It could even refuse to compile a file if it can’t meet its musttail obligations; not a good characteristic for a target language.

Really you would like it if all function parameters were allocated to registers. You can ensure this is the case if, say, you only pass the first n values in registers, and then pass the rest in global variables. You don’t need to pass them on a stack, because you can make the callee load them back to locals as part of the prologue.

What’s fun about this is that it also neatly enables multiple return values when compiling to C: simply go through the set of function types used in your program, allocate enough global variables of the right types to store all return values, and make a function epilogue store any “excess” return values—those beyond the first return value, if any—in global variables, and have callers reload those values right after calls.
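Here is a hedged sketch of the idea, with a pretend budget of two register-passed values; the variable and function names are illustrative, not output of any real compiler:

#include <stdint.h>

/* Illustrative sketch: arguments and return values beyond the first two
   travel through globals; the callee reloads extra arguments in its
   prologue and stores extra return values before returning. */
static uint64_t arg_spill_0, arg_spill_1; /* extra arguments */
static uint64_t ret_spill_0;              /* extra return value */

static uint64_t callee(uint64_t a0, uint64_t a1) {
  uint64_t a2 = arg_spill_0, a3 = arg_spill_1; /* prologue reload */
  ret_spill_0 = a2 + a3;                       /* second return value */
  return a0 + a1;                              /* first return value */
}

static uint64_t caller(void) {
  arg_spill_0 = 3;
  arg_spill_1 = 4;
  uint64_t r0 = callee(1, 2);
  uint64_t r1 = ret_spill_0; /* reload excess return value after the call */
  return r0 + r1;
}

Because nothing beyond the register-passed parameters lives on the stack, tail calls between such functions never need to shuffle stack arguments.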

what’s not to like

Generating C is a local optimum: you get the industrial-strength instruction selection and register allocation of GCC or Clang, you don’t have to implement many peephole-style optimizations, and you get to link to possibly-inlinable C runtime routines. It’s hard to improve over this design point in a marginal way.

There are drawbacks, of course. As a Schemer, my largest source of annoyance is that I don’t have control of the stack: I don’t know how much stack a given function will need, nor can I extend the stack of my program in any reasonable way. I can’t iterate the stack to precisely enumerate embedded pointers (but perhaps that’s fine). I certainly can’t slice a stack to capture a delimited continuation.

The other major irritation is about side tables: one would like to be able to implement so-called zero-cost exceptions, but without support from the compiler and toolchain, it’s impossible.

And finally, source-level debugging is gnarly. You would like to be able to embed DWARF information corresponding to the code you residualize; I don’t know how to do that when generating C.

(Why not Rust, you ask? Of course you are asking that. For what it is worth, I have found that lifetimes are a frontend issue; if I had a source language with explicit lifetimes, I would consider producing Rust, as I could machine-check that the output has the same guarantees as the input. Likewise if I were using a Rust standard library. But if you are compiling from a language without fancy lifetimes, I don’t know what you would get from Rust: fewer implicit conversions, yes, but less mature tail call support, longer compile times... it’s a wash, I think.)

Oh well. Nothing is perfect, and it’s best to go into things with your eyes wide open. If you got down to here, I hope these notes help you in your generations. For me, once my generated C type-checked, it worked: very little debugging has been necessary. Hacking is not always like this, but I’ll take it when it comes. Until next time, happy hacking!