Supporting emoji in identifiers

It is indeed most unfortunate that legacy standards such as the XID_Start and XID_Continue Unicode properties prevent the use of perfectly cromulent and inclusive symbols such as emoji. Fortunately, we can at least get a partial workaround using even more ancient text standards, e.g. Egyptian hieroglyphs. This is valid Rust code:

    let ๐“€€ = 1u32;
    let ๐“†ฃ = 2u32;
    let ๐“€ฌ = 3u32;
    let ๐“†ธ = vec![๐“€€, ๐“†ฃ, ๐“€ฌ];
    let ๐“€ซ = ๐“†ธ.into_boxed_slice();
    let ๐“€ญ = ๐“€ซ.len();

You just need to disable the pesky `uncommon_codepoints` lint.
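
For instance, a crate-level allow attribute should do the trick (a minimal sketch, assuming a rustc recent enough to accept non-ASCII identifiers):

    #![allow(uncommon_codepoints)]

    fn main() {
        // With the lint silenced, hieroglyphic identifiers compile warning-free.
        let ๐“€€ = 1u32;
        let ๐“†ฃ = 2u32;
        println!("{}", ๐“€€ + ๐“†ฃ);
    }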

If you think that all the good stuff was in the past and they just don't make languages like the old ones anymore (just look at Chinese, gosh! That's what I call a bloated language!), you can use the simple and natural cuneiform from a more civilized age:

    let ๐’†ฉ = 1u32;
    let ๐’†ฏ = 2u32;
    let ๐’€€ = 3u32;
    let ๐’…„ = vec![๐’†ฉ, ๐’†ฏ, ๐’€€];
    let ๐’†ณ = ๐’…„.into_boxed_slice();
    let ๐’ซ = ๐’†ณ.len();

Even a child can write that one! And they said COBOL was archaic ๐“ƒœ ๐“†ฆ

The learned ones may prefer something more exquisite, with subtle artistic expression in the stroke of every symbol:

    let ๊ฆ‹ = 1u32;
    let ๊ฆฏเทฏ = 2u32;
    let ๊งฎ = 3u32;
    let ๊ฉถ = vec![๊ฆ‹, ๊ฆฏเทฏ, ๊งฎ];
    let ๊ฌค = ๊ฉถ.into_boxed_slice();
    let ๊˜š = ๊ฌค.len();

Ah, Unicode! You're a real treasure tr๊™ฎve of beautiful artisanal code!
