# Steps for Ubuntu 24.04 (without CUDA support)
# Clone updated repository
git clone https://github.com/pedronahum/swift-apis.git
# Install older GCC if needed, e.g.:
sudo apt-get install g++-9
# Install a Swift 6.2 DEV toolchain (required for autodifferentiation)
# GCC 9 is required to compile TensorFlow 2.4
export CC=/usr/bin/gcc-9
export CXX=/usr/bin/g++-9
rm -rf out
cmake -B out -G Ninja \
-S swift-apis \
-D CMAKE_BUILD_TYPE=Release \
-D ENABLE_PYTHON_SUPPORT=OFF
cmake --build out
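Once the build finishes, a quick smoke test is to check the build output and run the test suite. The path and the use of CTest below are assumptions based on a default CMake/Ninja layout, not something verified against this repository:

```shell
# Confirm the build tree was produced (path assumed from the -B out flag above).
ls out

# Run any configured tests through CTest; harmless if no tests are registered.
ctest --test-dir out --output-on-failure
```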
As of today, I am facing a SIL issue related to Conv1D, and I am trying to work out whether it is a compiler bug or an error on my side. The compiler crash looks like this:

While computing canonical type for
type '@convention(thin) <Self where Self : Module> (τ_0_0) -> (τ_0_0.Input) -> (τ_0_0.Output, (τ_0_0.Input.TangentVector, τ_0_0.TangentVector) -> τ_0_0.Output.TangentVector)'
... This usually indicates the caller passed the wrong type or generic signature to getReducedType().

In other words, Swift's type checker gets stuck while generating the differentiable function signature for a type conforming to Module.
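For context, the kind of shape that triggers the crash looks like the following. This is a hypothetical minimal reproduction I would use to narrow the issue down, not a confirmed test case from the repository; the filter shape and layer names are arbitrary:

```swift
import TensorFlow

// Hypothetical minimal reproduction: a Layer (which refines Module)
// whose body routes through Conv1D. Differentiating callAsFunction
// is where the canonical-type computation fails.
struct TinyConvModel: Layer {
    // filterShape is (width, input channels, output channels).
    var conv = Conv1D<Float>(filterShape: (3, 4, 8), activation: relu)

    @differentiable
    func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
        conv(input)
    }
}
```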
Get a taste of protocol-oriented differentiable programming.
This repository hosts Swift for TensorFlow's deep learning library, available both as a part of Swift for TensorFlow toolchains and as a Swift package.
This library is automatically integrated into Swift for TensorFlow toolchains; you do not need to add it as a Swift Package Manager dependency.
Open an empty Colaboratory now to try out Swift, TensorFlow, differentiable programming, and deep learning.
For detailed usage and troubleshooting, see Usage on the Swift for TensorFlow project homepage.
Simply import TensorFlow to get the full power of TensorFlow.
import TensorFlow
let hiddenSize: Int = 10
struct Model: Layer {
var layer1 = Dense<Float>(inputSize: 4, outputSize: hiddenSize, activation: relu)
var layer2 = Dense<Float>(inputSize: hiddenSize, outputSize: hiddenSize, activation: relu)
var layer3 = Dense<Float>(inputSize: hiddenSize, outputSize: 3, activation: identity)
@differentiable
func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
return input.sequenced(through: layer1, layer2, layer3)
}
}
var classifier = Model()
let optimizer = SGD(for: classifier, learningRate: 0.02)
Context.local.learningPhase = .training
// Dummy data.
let x: Tensor<Float> = Tensor(randomNormal: [100, 4])
let y: Tensor<Int32> = Tensor(randomUniform: [100])
One way to define a training epoch is to use the gradient(at:in:) function.
for _ in 0..<1000 {
let 𝛁model = gradient(at: classifier) { classifier -> Tensor<Float> in
let ŷ = classifier(x)
let loss = softmaxCrossEntropy(logits: ŷ, labels: y)
print("Loss: \(loss)")
return loss
}
optimizer.update(&classifier, along: 𝛁model)
}
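After training, you can switch the context to inference and run the classifier on data. This is a sketch following the same API as above; the argmax step for picking a class is my own addition to the original example:

```swift
// Switch to inference mode so layers such as dropout behave correctly.
Context.local.learningPhase = .inference

// Run the trained classifier and pick the most likely class per example.
let logits = classifier(x)
let predictions = logits.argmax(squeezingAxis: 1)
```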
Another way is to make use of methods on Differentiable or Layer that produce a backpropagation function. This allows you to compose your derivative computation with great flexibility.
for _ in 0..<1000 {
let (ŷ, backprop) = classifier.appliedForBackpropagation(to: x)
let (loss, 𝛁ŷ) = valueWithGradient(at: ŷ) { ŷ in softmaxCrossEntropy(logits: ŷ, labels: y) }
print("Model output: \(ŷ), Loss: \(loss)")
let (𝛁model, _) = backprop(𝛁ŷ)
optimizer.update(&classifier, along: 𝛁model)
}
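Because the backpropagation function is an ordinary closure, you can transform the output gradient before pulling it back through the model. As a sketch, here is the same loop body with gradient clipping inserted between the loss gradient and the backpropagation step; the clipping itself and its threshold are my own additions, not part of the original example:

```swift
let (ŷ, backprop) = classifier.appliedForBackpropagation(to: x)
let (loss, 𝛁ŷ) = valueWithGradient(at: ŷ) { ŷ in
    softmaxCrossEntropy(logits: ŷ, labels: y)
}
print("Loss: \(loss)")
// Clip the output gradient before backpropagating it, e.g. to tame
// exploding gradients. The [-1, 1] range is arbitrary.
let clipped𝛁ŷ = 𝛁ŷ.clipped(min: -1.0, max: 1.0)
let (𝛁model, _) = backprop(clipped𝛁ŷ)
optimizer.update(&classifier, along: 𝛁model)
```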
For more models, go to tensorflow/swift-models.
Documentation covering development can be found in the Developer Guide.
Please report bugs and feature requests using GitHub issues in this repository.
Discussion about Swift for TensorFlow happens on the [email protected] mailing list.
We welcome contributions: please read the Contributor Guide to get started. It's always a good idea to discuss your plans on the mailing list before making any major submissions.
In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.
The Swift for TensorFlow community is guided by our Code of Conduct, which we encourage everybody to read before participating.