[Migrated] Array init with 0i32 uses 0u32 instead. #85
Comments
What is the state of this issue? I am able to replicate it on latest.
Oh, looks like that PR got lost in the old-repo-to-new-repo shuffle!
LegNeato pushed a commit to LegNeato/rust-gpu that referenced this issue on Apr 9, 2025: "Fixes Rust-GPU#85. Fixes EmbarkStudios/rust-gpu#1061. Thanks to @LegNeato"
@149segolte can you try with the PR / branch I just put up?
github-merge-queue bot pushed a commit that referenced this issue on Apr 9, 2025: "Fixes #85. Fixes EmbarkStudios/rust-gpu#1061. Thanks to @LegNeato"
@charles-r-earp thanks for the original bug report and repros 🍻
Issue automatically imported from old repo: EmbarkStudios/rust-gpu#1061
Old labels: t: bug
Originally created by charles-r-earp on 2023-05-21T05:56:49Z
Initializing arrays with the repeat syntax `[0i32; N]` seems to replace the `0i32` with `0u32`.
Example (see https://github.com/charles-r-earp/rust-gpu/tree/array-init for compiletests).
A `0u32` is stored to `o` instead of the expected `0i32`, which fails to compile. What works:
- `<[T; N]>::default()`
- `[0i32, 0i32, ..]`
- `[1i32; 1]`
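For reference, a plain-Rust sketch of the patterns above (the variable names here are illustrative, not from the issue). On host Rust all three forms produce the same `[i32; N]` value; the bug reported here is specific to the rust-gpu SPIR-V backend, which lowered the repeat form's zero with a `0u32` constant instead of `0i32`.

```rust
fn main() {
    const N: usize = 4;
    // The pattern that miscompiled on rust-gpu: repeat-style init with 0i32.
    let a: [i32; N] = [0i32; N];
    // Workarounds reported in the issue:
    let b: [i32; N] = <[i32; N]>::default(); // Default-based init
    let c: [i32; N] = [0i32, 0i32, 0i32, 0i32]; // element-wise init
    // All three are equivalent i32 zero arrays on host Rust.
    assert_eq!(a, b);
    assert_eq!(a, c);
    println!("{:?}", a);
}
```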