For #4: the `'de` lifetime is the lifetime of the deserializer (and of the input data). It exists so you can deserialize the data into a type without copying it, e.g. deserializing a `&'a [u8]` buffer into a `&'a str`. It's documented here if you want to read more.
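To make the idea concrete without pulling in serde itself, here's a minimal std-only sketch of the same zero-copy pattern: the output borrows straight from the input buffer, and the lifetime parameter ties them together the way `'de` does in `Deserialize<'de>`. The `Message` type and `parse` function are hypothetical, purely for illustration.

```rust
// A "deserialized" value that borrows from the input buffer
// instead of owning a copy (hypothetical type for illustration).
#[derive(Debug, PartialEq)]
struct Message<'a> {
    body: &'a str,
}

// The lifetime ties the output to the input: no allocation, no copy.
// In serde, the 'de lifetime plays this role for Deserialize<'de>.
fn parse<'a>(input: &'a [u8]) -> Result<Message<'a>, std::str::Utf8Error> {
    let body = std::str::from_utf8(input)?;
    Ok(Message { body })
}

fn main() {
    let buf: &[u8] = b"hello";
    let msg = parse(buf).unwrap();
    // msg.body points into buf, so it cannot outlive it.
    assert_eq!(msg.body, "hello");
    println!("{}", msg.body);
}
```

With serde you'd get the same effect by deriving `Deserialize` on a struct containing `&'a str` fields; the derive wires the struct's lifetime up to `'de` for you.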
Without context, there's no meaningful way to compare the performance of these. The compiler is complex enough that what you do inside the loop, and after it, affects which optimizations it can apply.
Do you have more context? What’s actually happening in the code?