Translator: NiZerin

Original link: nullderef.com/blog/rust-f…


Rust makes conditional compilation very easy to express, mainly thanks to Cargo features. They integrate well into the language and are very easy to use. But one thing I’ve learned from maintaining Rspotify, a library for the Spotify API, is that you shouldn’t get hooked on them. Conditional compilation should be used only when it’s the only way to solve a problem, for a number of reasons that I’ll explain in this article.

This may be obvious to some people, but it wasn’t so clear to me when I started using Rust. Even if you already know this, it can be an interesting reminder; perhaps you forgot about it in your latest project and added an unnecessary feature.

Conditional compilation is nothing new. C and C++ have been doing it for a long time, for one. So the same reasoning can be applied to those languages as well. However, in my experience, conditional compilation is much easier to use in Rust, which means it is also more likely to be abused.

The problem

I ran into this dilemma when deciding how to configure caching tokens in Rspotify. The library enables you to persist authentication tokens through JSON files. This way, when the program starts again, the token from the previous session can be reused without having to perform the full authentication process again — that is, until the token expires.

Initially, this was an optional feature called cached_token, and I didn’t think too much about it: if you don’t need it, why compile the code that saves and reads the token file? The easiest way to make it optional is a feature in your Cargo.toml.
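As a sketch of what that looks like (the function name and token value here are hypothetical, assuming a feature declared as `cached_token = []` under `[features]` in Cargo.toml):

```rust
// Hypothetical sketch: this function only exists when the crate is
// compiled with `--features cached_token`.
#[cfg(feature = "cached_token")]
fn save_token_to_file(token: &str) {
    // In the real library this would write the token to a JSON file.
    println!("Saving token {token} to the cache file!");
}

fn authenticate() -> &'static str {
    let token = "fresh-token";
    // The call site needs a guard too, or the build breaks
    // whenever the feature is disabled:
    #[cfg(feature = "cached_token")]
    save_token_to_file(token);
    token
}

fn main() {
    let token = authenticate();
    println!("Authenticated with {token}");
}
```

Note that every call site of the gated function needs its own `#[cfg]` guard, which is part of what makes this approach spread through a code base.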

However, I later needed another very similar option, refreshing_token. When enabled, the client automatically refreshes expired tokens. As this pattern showed up more and more in the library, I wanted to make sure it was optimally designed. Digging deeper, I began to see the many inconveniences of features:

They are inflexible: you can’t have one client with cached tokens and another without them in the same program. It’s a library-wide thing: you either enable them or you don’t. Obviously, they can’t be configured at runtime either; the user may want to choose which behavior to follow while the program is running.

They are ugly: writing #[cfg(feature = "cached_token")] is weirder and wordier than a plain if cached_token.

They are messy: features scattered through the code base are hard to manage. You can easily find yourself in the Rust equivalent of #ifdef hell.

They are hard to document and test: Rust provides no standard way to expose a library’s features in its documentation. All you can do is list them manually on the docs’ home page. Testing is also harder, because you have to figure out which combinations of features cover the entire code base and apply them whenever you want to run the tests.

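To illustrate the “messy” point, here is a contrived sketch (the feature names are the ones from this article, but the combinations are hypothetical) of how quickly cfg-gated logic turns into the Rust equivalent of #ifdef hell:

```rust
// Every combination of two features needs its own cfg-gated branch.
fn token_strategy() -> &'static str {
    #[cfg(all(feature = "cached_token", feature = "refreshing_token"))]
    return "cache and refresh";

    #[cfg(all(feature = "cached_token", not(feature = "refreshing_token")))]
    return "cache only";

    #[cfg(all(not(feature = "cached_token"), feature = "refreshing_token"))]
    return "refresh only";

    // Only reachable when neither feature is enabled.
    "no token handling"
}

fn main() {
    println!("Strategy: {}", token_strategy());
}
```

Two features already mean four combinations to reason about, document, and test; each new feature doubles that.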
All this just to make sure the binary doesn’t contain code you don’t need, which is apparently considered important. But how true is that, really? How important is it?

The alternative

It turns out that one of the simplest optimizations a compiler can implement is constant propagation. Combined with dead code elimination, it can produce exactly the same effect as features, but in a more natural way. Instead of adding features to configure the program’s behavior, you can do the same with a Config struct. If there’s only a single option to configure you might not even need the struct, but it’s more future-proof. Something like:

#[derive(Default)]
struct Config {
    cached_token: bool,
    refreshing_token: bool,
}

You can then modify your client so that it optionally takes a Config struct:

struct Client {
    config: Config
}

impl Client {
    /// Uses the default configuration for the initialization
    fn new() -> Client {
        Client {
            config: Config::default(),
        }
    }

    /// Uses a custom configuration for the initialization
    fn with_config(config: Config) -> Client {
        Client {
            config,
        }
    }

    fn do_request(&self) {
        if self.config.cached_token {
            println!("Saving cache token to the file!");
        }
        // The previous block used to be equivalent to:
        //
        // #[cfg(feature = "cached_token")]
        // {
        //     println!("Saving cache token to the file!");
        // }

        if self.config.refreshing_token {
            println!("Refreshing token!");
        }

        println!("Performing request!");
    }
}

Finally, users can customize the client they want in their code in a very natural way:

fn main() {
    // Option A
    let client = Client::new();

    // Option B
    let config = Config {
        cached_token: true,
        ..Default::default()
    };
    let client = Client::with_config(config);
}

Proving that you end up with the same code

Thanks to the excellent Compiler Explorer, we can use the following code snippet to ensure that the compilation is as expected:
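The embedded snippet doesn’t survive in this copy of the post; a minimal sketch of the kind of code you might paste into Compiler Explorer, branching on a plain boolean field, could be:

```rust
struct Config {
    cached_token: bool,
}

fn do_request(config: &Config) -> &'static str {
    if config.cached_token {
        // With constant propagation plus dead code elimination, this arm
        // (its string included) can vanish from the assembly when the
        // flag is known to be false at the call site.
        return "Saving cache token to the file!";
    }
    "Performing request!"
}

fn main() {
    let config = Config { cached_token: false };
    println!("{}", do_request(&config));
}
```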

It seems that, starting with Rust 1.53, for opt-level values of 2 or higher, the code for disabled functionality doesn’t even appear in the assembly (easy to see by looking at the strings at the end). cargo build --release uses opt-level 3, so this should not be a problem for production binaries.

And we aren’t even using const! I wondered what would happen in that case, with this slightly modified snippet:
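Again, the original embedded snippet is missing here; a sketch of the const variant might look like this:

```rust
struct Config {
    cached_token: bool,
}

// The configuration is now a compile-time constant instead of a
// runtime value.
const CONFIG: Config = Config {
    cached_token: false,
};

fn do_request() -> &'static str {
    if CONFIG.cached_token {
        return "Saving cache token to the file!";
    }
    "Performing request!"
}

fn main() {
    println!("{}", do_request());
}
```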

Huh. We actually get the same result: the generated assembly is exactly the same, and the optional code disappears only from opt-level=2 onward.

The thing is, const only means that its value may be inlined, not that it must be. Nothing more. So it doesn’t actually give us any stronger guarantee than a regular variable that the code inside the function will be simplified.

So, as far as my investigation goes, it’s best not to worry about it and just use regular variables instead of const. It reads better and produces the same results.

Not that it matters, anyway

Even if the previous optimizations didn’t kick in, would the optional code really hurt the final binary that much? Are we over-engineering the solution, as usual? The truth is that the optional code for caching and refreshing tokens isn’t even that big.

It depends, of course, but in my opinion binary bloat is not a big deal for higher-level binaries like these. Rust already statically embeds its standard library, a minimal runtime, and extensive debugging information in every binary, for a total of around 3 MB. The only overhead you might get at runtime is a branch here and there.

Conclusion

Sometimes you do need conditional compilation; there’s no way around it. Features are useful when you’re dealing with platform-specific code or want to reduce the number of dependencies of your crate.
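A legitimate use, for instance, is gating an optional dependency. As a hypothetical sketch in Cargo.toml (the feature name and dependency are illustrative, not from Rspotify):

```toml
[dependencies]
serde = { version = "1", optional = true }

[features]
# Enabling `serialization` pulls in serde; leaving it off drops both the
# dependency and any code behind #[cfg(feature = "serialization")].
serialization = ["dep:serde"]
```

Here the feature genuinely removes a dependency from the build, which no runtime Config field could do.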

But that’s not the case in Rspotify; conditional compilation is definitely not the way to go there. When you’re about to introduce a new feature to your crate, ask yourself: “Do I really need conditional compilation for this?”

Neither cached_token nor refreshing_token follows the usual reasoning for why a feature might be added. They don’t give access to new functionality or modules. They don’t get rid of optional dependencies. And they’re certainly not platform-specific. They merely configure the library’s behavior.

Perhaps features should have been named differently to avoid this confusion? Enabling support for cached tokens does sound like a “feature,” whereas operating-system-specific code doesn’t really feel like one. I sometimes find the name confusing, and Google agrees with me on this one: searching for information about Rust features can return completely unrelated results, because they contain the word “feature” only in the sense of “an attribute or aspect of a program.” It’s a bit like having to google “Golang X” instead of “Go X,” because the latter makes no sense. But in any case, my suggestion comes far too late.