
Erroneous multiple mutable borrows message #18803

Closed
bfops opened this issue Nov 9, 2014 · 8 comments
Labels
A-lifetimes Area: Lifetimes / regions

Comments

@bfops
Contributor

bfops commented Nov 9, 2014

The following fails to compile:

pub struct GLContext;

pub struct ShaderHandle<'a>;

impl<'a> ShaderHandle<'a> {
  pub fn new<'b: 'a>(
    _gl: &'b mut GLContext,
  ) -> ShaderHandle<'a> {
    ShaderHandle
  }
}

pub struct Shaders<'a> {
  pub shaders: Vec<ShaderHandle<'a>>,
}

impl<'a> Shaders<'a> {
  pub fn new<'b: 'a> (
    gl: &'b mut GLContext,
  ) -> Shaders<'a> {
    let shaders = vec![ShaderHandle::new(gl)];

    Shaders {
      shaders: shaders,
    }   
  }
}

pub fn main() {
  let mut gl = GLContext;

  let x = Shaders::new(&mut gl);
  let y = Shaders::new(&mut gl);
}

If I inline the definition of Shaders::new, it compiles. If I remove the Vec and just store a single ShaderHandle in each Shaders, it compiles. Is this a bug, or some subtlety in borrow semantics?

@huonw
Member

huonw commented Nov 9, 2014

The compiler output is:

18803.rs:33:29: 33:31 error: cannot borrow `gl` as mutable more than once at a time
18803.rs:33   let y = Shaders::new(&mut gl);
                                        ^~
18803.rs:32:29: 32:31 note: previous borrow of `gl` occurs here; the mutable borrow prevents subsequent moves, borrows, or modification of `gl` until the borrow ends
18803.rs:32   let x = Shaders::new(&mut gl);
                                        ^~
18803.rs:34:2: 34:2 note: previous borrow ends here
18803.rs:29 pub fn main() {
18803.rs:30   let mut gl = GLContext;
18803.rs:31 
18803.rs:32   let x = Shaders::new(&mut gl);
18803.rs:33   let y = Shaders::new(&mut gl);
18803.rs:34 }
            ^
error: aborting due to previous error

I think this is to do with how the compiler handles unused lifetimes. You probably want to store it with a ContravariantLifetime marker, which will probably cause all variations to fail to compile with the above message.
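A sketch of that suggestion, translated to current Rust: `std::marker::ContravariantLifetime` was removed before 1.0, and `PhantomData<&'a mut GLContext>` is its present-day equivalent. With the marker in place, the handle genuinely carries the mutable borrow of the context, so the borrow checker complains in every variation:

```rust
// Modern equivalent of huonw's ContravariantLifetime suggestion:
// PhantomData<&'a mut GLContext> makes ShaderHandle actually hold the
// mutable borrow of the context for its whole lifetime.
use std::marker::PhantomData;

pub struct GLContext;

pub struct ShaderHandle<'a> {
    _marker: PhantomData<&'a mut GLContext>,
}

impl<'a> ShaderHandle<'a> {
    pub fn new(_gl: &'a mut GLContext) -> ShaderHandle<'a> {
        ShaderHandle { _marker: PhantomData }
    }
}

fn main() {
    let mut gl = GLContext;
    let x = ShaderHandle::new(&mut gl);
    drop(x); // x's borrow of gl ends here; keeping x alive past this
    // point would make the next line a "cannot borrow `gl` as mutable
    // more than once at a time" error, as in the output above
    let _y = ShaderHandle::new(&mut gl);
}
```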

cc @nikomatsakis for confirmation.

@bfops
Contributor Author

bfops commented Nov 9, 2014

Thanks! I removed <'b:'a> and added ContravariantLifetime<'a> to the definition of Shaders, and now compilation fails regardless of whether or not I'm using a Vec. But I'm still confused about why that happens: gl isn't actually being used anywhere, it's just being threaded through for the sake of the lifetime constraint, and even if it were being used, it's not necessarily being borrowed.

@bfops
Contributor Author

bfops commented Nov 9, 2014

And follow-up question: if this is the wrong way, what's the right one? I'm trying to make sure that an OpenGL context exists when I do things like create shaders, but the design above doesn't work, and neither does:

#[test]
pub fn foo() {
  let mut gl = GLContext::new();

  let x = gl.test_create_program();
  let y = gl.test_create_program();
}

Edit: One option is to separate the creation of the structs from their initialization, so that the structs can always be created with an immutable context (e.g. having no calls to Bind* functions), and the initialization logic can potentially mutate the context. That seems a little restrictive though.
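A minimal sketch of that create/initialize split. The names here (Program, new, init) are invented for illustration; the point is that creation takes only a shared borrow of the context, while initialization takes a short-lived mutable one:

```rust
// Sketch: creation with &GLContext, initialization with &mut GLContext.
pub struct GLContext;

pub struct Program {
    initialized: bool,
}

impl Program {
    // Creation needs only a shared borrow, so many programs can coexist.
    pub fn new(_gl: &GLContext) -> Program {
        Program { initialized: false }
    }

    // Initialization (e.g. the Bind* calls) borrows the context mutably,
    // but only for the duration of this call.
    pub fn init(&mut self, _gl: &mut GLContext) {
        self.initialized = true;
    }
}

fn main() {
    let mut gl = GLContext;
    let mut x = Program::new(&gl);
    let mut y = Program::new(&gl); // fine: both are shared borrows
    x.init(&mut gl);
    y.init(&mut gl); // fine: the mutable borrows are sequential
    assert!(x.initialized && y.initialized);
}
```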

@nikomatsakis
Contributor

@bfops if you want to have a shared context that everyone references, I suggest you use a shared reference (& not &mut) and then use Cell or RefCell for mutability where necessary.
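A minimal sketch of that approach, with an invented bound_shader field standing in for whatever context state the Bind* calls would mutate. Handles hold a shared reference, so any number can coexist, and mutation goes through RefCell with a runtime check:

```rust
// Sketch: shared &GLContext everywhere, RefCell for interior mutability.
use std::cell::RefCell;

pub struct GLContext {
    bound_shader: RefCell<Option<u32>>, // invented state for illustration
}

pub struct ShaderHandle<'a> {
    gl: &'a GLContext,
    id: u32,
}

impl<'a> ShaderHandle<'a> {
    pub fn new(gl: &'a GLContext, id: u32) -> ShaderHandle<'a> {
        ShaderHandle { gl, id }
    }

    pub fn bind(&self) {
        // Mutation happens through the RefCell, checked at runtime
        // rather than at compile time.
        *self.gl.bound_shader.borrow_mut() = Some(self.id);
    }
}

fn main() {
    let gl = GLContext { bound_shader: RefCell::new(None) };
    let x = ShaderHandle::new(&gl, 1);
    let y = ShaderHandle::new(&gl, 2); // fine: both are shared borrows
    x.bind();
    y.bind();
    assert_eq!(*gl.bound_shader.borrow(), Some(2));
}
```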

(cc #3598 which would make the "unused lifetime parameter" an error)

@bfops
Contributor Author

bfops commented Nov 10, 2014

@nikomatsakis I do specifically want the compiler giving me sane checks for mutable usage of the OpenGL context, though (and also to not incur runtime overhead). My solution right now is to have a GLContextExistence struct that I use for the lifetime bounds, and a GLContext struct that is mutated or not in certain calls (e.g. gl.GetError takes an immutable reference, but gl.UseShader takes a mutable one), but I still don't understand why it's even necessary - the GLContext is never actually used; its lifetime is only used for lifetime bounding.
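The two-struct split described above might look roughly like the following sketch. GLContextExistence and GLContext come from the comment; everything else (method bodies, create_shader) is invented for illustration:

```rust
// Sketch: a zero-sized existence token for lifetime bounds, plus a
// separate GLContext whose methods take &self or &mut self.
pub struct GLContextExistence;
pub struct GLContext;

pub struct ShaderHandle<'a> {
    _exists: &'a GLContextExistence,
}

impl GLContext {
    pub fn get_error(&self) -> u32 { 0 }      // immutable, like gl.GetError
    pub fn use_shader(&mut self, _id: u32) {} // mutable, like gl.UseShader
}

pub fn create_shader<'a>(
    exists: &'a GLContextExistence,
    _gl: &mut GLContext,
) -> ShaderHandle<'a> {
    // The handle borrows only the existence token, so the &mut GLContext
    // borrow ends when this function returns.
    ShaderHandle { _exists: exists }
}

fn main() {
    let exists = GLContextExistence;
    let mut gl = GLContext;
    let _x = create_shader(&exists, &mut gl);
    let _y = create_shader(&exists, &mut gl); // fine: gl's borrow ended
    assert_eq!(gl.get_error(), 0);
}
```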

@steveklabnik
Member

So is this a bug or not?

@steveklabnik steveklabnik added the A-lifetimes Area: Lifetimes / regions label Jan 27, 2015
@nikomatsakis
Contributor

Ah, I meant to reply to @bfops, sorry about that. I think this is not a bug. Basically, this is how the type system works. That said, there might be a way to express what @bfops wanted with some cleverness. Or else it is potential fodder for future type-system extensions. @bfops maybe you can ping me on IRC and we can go over the example? Right now this is all kind of "out of cache".

@bfops
Contributor Author

bfops commented Jan 30, 2015

SGTM thanks!
