"struct(gc)" or "struct(ref)" and subtypes

I've run into trouble when using tracing (mark-and-sweep) garbage collector libraries, mainly because I tried to inherit a trait. Previously, for a symbol table in my own language's compiler, I got a suggestion to use Weak&lt;RefCell&lt;T&gt;&gt; (keeping a strong Rc reference in a symbol pool); however, that way I can't hold two or more references to the same resource in the same scope. Another thing I tried before this was the gc crate, but Gc cannot be used as a self parameter.

Another possibility is to use unsafe code: a struct Symbol instead of a trait Symbol, with a #[repr(C)] attribute. But even with unsafe, the code gets weird.

Possible solutions

a) Proper reference types (struct(ref)) and struct(ref) subtypes

There's a Rust dialect that supports a struct(gc) annotation to indicate the struct is passed by reference, and WebAssembly supports reference types. So why not do the same in Rust and support inheritance along with it, which would benefit compiler developers among other use cases?

Instead of struct(gc), it could be struct(ref), since ref is an existing keyword, whereas gc would have to be a contextual keyword.

Here's how it would look:

struct(ref) Symbol {
}
impl Symbol {
    fn method_that_takes_self(&self) {
    }
}

struct(ref) StringConstant: Symbol {
}
impl StringConstant {
    #[override]
    fn method_that_takes_self(&self) {
    }
}

// covariant conversion
let s: &Symbol = StringConstant { ... };

// contravariant conversion
let c = s as StringConstant;
let c = s.try_into::<StringConstant>();

It'd implement:

  • std::clone::Clone (no way to override)
  • std::marker::Copy
  • std::convert::TryInto
  • Field inheritance
  • Method inheritance and overriding
  • Instance-of test (I don't know what syntax or method would fit)

b) Smart pointer as self parameter

There's also another way instead of feature (a): supporting a Gc-like smart pointer as a self parameter (and thus using a trait). But I believe proper reference types are more efficient, because with a trait, several default methods have to be reimplemented for different types (like is_string_constant to determine whether a Symbol is a StringConstant), for example:

trait Symbol {
    fn is_string_constant(&self) -> bool { false }

    fn method_that_takes_self(self: &Gc<Self>) {
    }
}

struct StringConstant {
}

impl Symbol for StringConstant {
    fn is_string_constant(&self) -> bool { true }

    fn method_that_takes_self(self: &Gc<Self>) {
    }
}

It would help my understanding of your explanations if you elaborated on these two points:

In both cases the description is so concise / high-level that it's impossible to understand where exactly the limitation you're describing came from or even what exactly the limitation was that you came across. Naming concrete types in the explanation or perhaps even some code examples would be very enlightening.


I edited the second quote because I don't remember the issue. Anyway, here is the origin of the two (I was using gc earlier):

I also updated the last part of my post to favour reference types over a smart pointer (like Gc) as a self parameter.

Full-blown garbage collection, inheritance, and implicit reference types are simply not Rust. If you have a use case that predominantly requires such a paradigm, or if you simply personally prefer to write in a traditional OO style, then Rust might not be the language you are looking for — at this point it sounds like you want to write in C# or Java instead.

Adding 3 such massive changes to Rust's type system would likely lead to a massive ecosystem split at best, and be borderline impossible to consolidate with the existing model of ownership and borrowing at worst. It's just not realistic.


This is a restriction we would like to lift, fwiw: see the arbitrary_self_types feature.

For downcasting, you already can use Any (and take your pick of "make your own any" crates to add downcasting seamlessly onto custom trait objects).

