Hello, I’m fairly new to Rust and came across this. Can someone explain how the following example is able to infer the constant value from the length of the array passed in? At this point, inferred type generation for function calls is a bit hand-wavy to me. Does anyone know of a resource that breaks down all the different ways inference can be used (for instance, in this example I hadn’t seen it used for consts) and what its limitations are in Rust? I often run into a ‘this type cannot be inferred’ error without really knowing why, and just throw in the type to make it go away.

Any other examples people could point me to would be appreciated as well.

Thanks!

#[derive(Debug)]
struct Buffer<T, const LENGTH: usize> {
    buf: [T; LENGTH],
}

impl<T, const LENGTH: usize> From<[T; LENGTH]> for Buffer<T, LENGTH> {
    fn from(buf: [T; LENGTH]) -> Self {
        Buffer { buf }
    }
}

fn main() {
    let buf = Buffer::from([0, 1, 2, 3, 5]);
    dbg!(&buf);
}
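
For reference, here's my guess at what the fully explicit call would look like (not sure this is exactly right):

let buf = Buffer::<i32, 5>::from([0, 1, 2, 3, 5]);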

Edit: for some reason, the code markdown is hiding things inside of the <>'s (at least on my lemmy viewing client)

  • @[email protected]OP
    link
    fedilink
    English
    19 days ago

    I gather from your explanation that, in order to tell beforehand whether or not a type will be inferred, you really need to examine the code and see how things are being handled and optimized out (and even then you still may not know). Interesting, thanks.

    • @[email protected]
      link
      fedilink
      19 days ago

      You don’t need to know at all what optimizations will happen. I mentioned that as an example of something that is known at compile time but not at run time.

      Whether or not a type will be inferred is determined by you. If you tell the compiler the type, it will never be inferred. If you don’t tell the compiler the type, it will try to infer it. If it tries to infer the type and fails, it will throw a compiler error and won’t finish building the binary.

      The compiler will only successfully infer a type if it has enough information at compile time to know with certainty what the type is. Of course, the compiler is not perfect, so in complex situations it can fail even when it theoretically has enough information to succeed.

      Examples where inference will succeed:

      
      fn copy<T>(input: T) -> T {
          return input;
      }
      
      fn main() {
          let a = 47; // here a is of type i32; this was not inferred, it's just the default type of integer literals
          let b = copy(a); // here the compiler knows that a is i32, therefore it should call copy<i32>. Due to the type signature of copy<i32>, the type of b is inferred to be i32
      
          let c: u16 = 25; // here instead of the default, we manually specify that the type of c is u16
          let d = copy(c); // this is the same as b, but instead of calling copy<i32>, copy<u16> is called. Therefore d is inferred to be u16
      
          let e = 60; // at first, this looks like a, and it should be the default of i32
          let f: i64 = copy(e); // here, since f is specified to be i64, copy<i64> is called. Therefore e, instead of falling back to the default of i32, is inferred to be i64, since inference takes precedence over the default.
      }
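
      The same mechanism covers the const generic LENGTH in your Buffer example. Here's a rough sketch reusing the types from your post (nothing new, just your code with the inference spelled out in comments):

      struct Buffer<T, const LENGTH: usize> {
          buf: [T; LENGTH],
      }

      impl<T, const LENGTH: usize> From<[T; LENGTH]> for Buffer<T, LENGTH> {
          fn from(buf: [T; LENGTH]) -> Self {
              Buffer { buf }
          }
      }

      fn main() {
          // the literal [0, 1, 2, 3, 5] has type [i32; 5], so resolving
          // Buffer::from fixes T = i32 and LENGTH = 5, the same way T was
          // fixed for copy<T> above
          let a = Buffer::from([0, 1, 2, 3, 5]);

          // inference can also flow from an annotation back into the call:
          // the element type u8 comes from the annotation (like f above),
          // while LENGTH = 3 comes from the literal itself
          let b: Buffer<u8, 3> = Buffer::from([1, 2, 3]);

          let _ = (a, b);
      }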
      

      Examples where inference will fail:

      
      trait Default {
          fn default() -> Self;
      }
      
      impl Default for i32 {
          fn default() -> i32 { return 0 }
      }
      
      impl Default for i8 {
          fn default() -> i8 { return 0 }
      }
      
      fn main() {
          // assuming the copy<T> function from the first example is also in scope here
          let a: i32 = 8;
          let b = copy(a);
          let c: u8 = copy(b);
          // What type should b be inferred to? If the compiler picks copy<i32> because a is i32, then it can't call copy<u8> later to initialize c. And if it picks copy<u8> instead, it can't take a as an argument, since a is i32. This results in a compiler error.
      
          let d = Default::default();
          // What type is d? both i32 and i8 implement the Default trait, each with its own return type.
          // let d: i32 = Default::default(); would compile correctly.
      }
      

      These situations might be obvious, but inference works as a chain; sometimes hundreds of types are inferred in a single function call, so you should know the basics to diagnose these kinds of problems.
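
      As a rough illustration of that chaining, here's a small sketch with standard iterators (nothing specific to your Buffer example):

      fn main() {
          // the single Vec<u64> annotation is enough for the compiler to work
          // backwards through the whole chain: collect() must build a Vec<u64>,
          // so the closure must return u64, so x must be u64, so the range
          // 1..=10 must iterate over u64 values
          let doubled: Vec<u64> = (1..=10).map(|x| x * 2).collect();
          println!("{:?}", doubled);
      }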