Decimal: different access levels for fields and initializer on macOS and Linux
According to the documentation, an initializer for Decimal is exposed as public API in Foundation: https://developer.apple.com/documentation/foundation/decimal/1407961-init
I see that on macOS I have access to the following fields and initializer:
public struct Decimal : @unchecked Sendable {
    public init()
    public init(_exponent: Int32, _length: UInt32, _isNegative: UInt32, _isCompact: UInt32, _reserved: UInt32, _mantissa: (UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16))
    public var _exponent: Int32
    public var _length: UInt32
    public var _isNegative: UInt32
    public var _isCompact: UInt32
    public var _reserved: UInt32
    public var _mantissa: (UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16)
}
and my test code compiles and runs:
extension Decimal {
    var details: String {
        return """
        Parts:
        _exponent = \(self._exponent)
        _length = \(self._length)
        _isNegative = \(self._isNegative)
        _isCompact = \(self._isCompact)
        _reserved = \(self._reserved)
        _mantissa = \(self._mantissa)
        significand = \(self.significand)
        """
    }
}
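For reference, exercising this on macOS looks like the following (a minimal sketch; Decimal(string:) is standard Foundation API, and the values in the comments follow from 123.45 == 12345 × 10^-2):
import Foundation

// Sketch: prints the underscored fields via the `details` extension above.
// Works on macOS, where those fields are public.
let d = Decimal(string: "123.45")!   // 123.45 == 12345 x 10^-2
print(d.details)                     // expect _exponent = -2, first mantissa word = 12345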
On Linux I get the following error:
DecimalTest.swift:30:32: error: '_exponent' is inaccessible due to 'internal' protection level
28 | """
29 | Parts:
30 | _exponent = \(self._exponent)
| `- error: '_exponent' is inaccessible due to 'internal' protection level
31 | _length = \(self._length)
32 | _isNegative = \(self._isNegative)
The same happens if I use the initializer:
extension Decimal: DetailsProviding {
    init() {
        self.init(_exponent: 0, _length: 0, _isNegative: 0, _isCompact: 0, _reserved: 0, _mantissa: (0, 0, 0, 0, 0, 0, 0, 0))
    }
}
The error on Linux is the following:
DecimalTest.swift:26:18: error: extra arguments at positions #2, #3, #4, #5, #6 in call
24 | extension Decimal: DetailsProviding {
25 | init() {
26 | self.init(_exponent: 0, _length: 0, _isNegative: 0, _isCompact: 0, _reserved: 0, _mantissa: (0, 0, 0, 0, 0, 0, 0, 0))
| `- error: extra arguments at positions #2, #3, #4, #5, #6 in call
27 | }
28 |
I use Xcode 16.0 beta 6 on macOS:
$ swift --version
swift-driver version: 1.115 Apple Swift version 6.0 (swiftlang-6.0.0.9.10 clang-1600.0.26.2)
Target: arm64-apple-macosx14.0
And Swift 6.0 release on Linux:
Swift version 6.0 (swift-6.0-RELEASE)
Target: aarch64-unknown-linux-gnu
I am seeing the same issue.
Are you able to use other initializers that are available on both platforms in your code?
Ubuntu 24.04, Swift 6.0.1: here are all the initializers shown for Foundation.Decimal (in FoundationEssentials.swiftinterface):
@available(macOS 10.10, iOS 8.0, watchOS 2.0, tvOS 9.0, *)
public struct Decimal : Sendable {
    public typealias Mantissa = (UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16)
    public init(mantissa: UInt64, exponent: Int16, isNegative: Bool)
    public init()
}
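For example, a value can be built portably through that public initializer (a minimal sketch; the sample mantissa/exponent are just for illustration):
import Foundation

// Portable construction through the public initializer listed above;
// it is available on both Darwin and Linux.
let value = Decimal(mantissa: 12345, exponent: -2, isNegative: false)
print(value)   // 123.45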
@itingliu we want to encode Decimal to flatbuffers efficiently as a fixed-size structure, so we would like an interface that gives us access to its internals.
On macOS we are able to access these fields and encode/decode them as plain types. On Linux, however, these fields are unavailable, so we have to implement a dummy Encoder that captures those fields from the encode(to:) method and a dummy Decoder through which we feed those fields back. It would be nice to have some structure of fixed-size types so that we could encode/decode Decimal without such a workaround.
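A sketch of the kind of raw, fixed-size access we mean (it leans on Decimal's in-memory layout, which is an assumption on our part, not a documented guarantee):
import Foundation

// Sketch only: copies Decimal's raw bytes as a fixed-size blob.
// This assumes the in-memory layout is stable and identical across
// platforms -- precisely the guarantee we would like to be explicit.
let d = Decimal(string: "123.45")!
let raw = withUnsafeBytes(of: d) { Data($0) }
print(raw.count)   // 20 bytes on Darwin; the layout elsewhere is an assumption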
We understand that encoding raw data may imply that we need some versioning in case this type's internals ever change or are reimplemented in the standard library. However, we would like the highest possible performance for Decimal, as it is a fundamental type for us.
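Concretely, something like the following shape would be enough on our side (all names here are hypothetical, not proposed API):
// Hypothetical fixed-size, versioned wire layout for Decimal's internals.
struct DecimalWireV1 {
    var version: UInt8 = 1        // bump if Decimal's internals ever change
    var exponent: Int32
    var length: UInt32
    var isNegative: UInt32
    var isCompact: UInt32
    var mantissa: (UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16, UInt16)
}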
Just a kind reminder: could you say whether the same interface as on macOS might be supported, please?
Shouldn't the access levels and initializers be identical across platforms? Ping @itingliu @parkera
(we still have this issue with 6.1)
Friendly ping @stephentyrone as discussed last week at the conference. 🙂