commit    4d8565496a5c153babdcb52b6cf7fa23b2dc2c05
author    Sam Lantinga <slouken@libsdl.org>  Sat Oct 01 13:35:36 2016 -0700
committer Sam Lantinga <slouken@libsdl.org>  Sat Oct 01 13:35:36 2016 -0700
tree      2a3c75614d2eba358aa27ab18099f3f8f3c1109c
parent    60e88f2d9a2f77c853685c78f0d57778e8b8bced
Fixed bug 3165 - define numbers don't match types in Swift

C.W. Betts

Swift is very strict about types: values of different signedness or size must be explicitly cast. Most of the defines are imported into Swift as 32-bit signed integers, while the corresponding struct fields are 32-bit unsigned integers. Appending a "u" suffix causes the defines to be imported as 32-bit unsigned integers.