Possibly. On this topic (machine learning, differentiable programming, GPU and parallel computing) I'd recommend the following videos:
It's already been done by another group of researchers. I'm not sure if that was a factor in their choice of Swift.
"A Differentiable Programming System to Bridge Machine Learning and Scientific Computing"
From the abstract:
> We describe Zygote, a Differentiable Programming system that is able to take gradients of general program structures. We implement this system in the Julia programming language. Our system supports almost all language constructs (control flow, recursion, mutation, etc.) and compiles high-performance code without requiring any user intervention or refactoring to stage computations.
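For intuition about what "gradients of general program structures" means: the AD system has to propagate derivatives through branches and recursion, not just straight-line tensor code. Here's a purely illustrative sketch in Python. Zygote itself is reverse-mode and works by source-to-source transformation; this uses the much simpler dual-number forward mode, but the point (ordinary control flow just works) is the same:

```python
class Dual:
    """A value paired with its derivative (a 'dual number')."""
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps
    def __mul__(self, other):
        o = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.val * o.eps + self.eps * o.val)
    __rmul__ = __mul__
    def __gt__(self, other):
        return self.val > (other.val if isinstance(other, Dual) else other)

def derivative(f, x):
    # Seed the input with derivative 1.0 and read off the output's derivative.
    return f(Dual(x, 1.0)).eps

def pow_rec(x, n):          # recursion
    return 1.0 if n == 0 else x * pow_rec(x, n - 1)

def f(x):                   # data-dependent branch
    return pow_rec(x, 3) if x > 0 else x * x

print(derivative(f, 2.0))   # d/dx x^3 at x = 2  -> 12.0
print(derivative(f, -3.0))  # d/dx x^2 at x = -3 -> -6.0
```

No refactoring or staging of `f` was needed; the derivative rules ride along with normal evaluation, which is (very loosely) the user experience the abstract describes.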
Just linking to this for those who haven't seen it.
What are you not sure of with Julia here? There are parts of the ecosystem to be unsure about, but this isn't one of them. Julia probably has the most advanced mathematical optimization tooling right now in JuMP (http://www.juliaopt.org/JuMP.jl/v0.19.2/), and some of the most advanced post-DL machine learning via its whole-language differentiable programming tools (Zygote, Tracker, ForwardDiff), which have showcased applications like quantum machine learning and neural stochastic differential equations (https://arxiv.org/abs/1907.07587). Some of it is still in flux, but in terms of ecosystem there's a lot there that you won't find in other languages.
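For anyone unfamiliar with how these two pieces connect: differentiable programming gives you gradients of arbitrary code, and gradient-based optimization is what you do with them. JuMP is a modeling layer over real solvers and handles far richer problems (constraints, integer variables); this hand-rolled Python sketch only shows the bare core loop, with a numeric derivative standing in for an AD one:

```python
def grad(f, x, h=1e-6):
    # Central-difference approximation; an AD system would supply this exactly.
    return (f(x + h) - f(x - h)) / (2 * h)

def minimize(f, x0, lr=0.1, steps=200):
    # Plain gradient descent on a scalar function.
    x = x0
    for _ in range(steps):
        x -= lr * grad(f, x)
    return x

# Minimize (x - 3)^2 + 1; the optimum is at x = 3.
xmin = minimize(lambda x: (x - 3) ** 2 + 1, x0=0.0)
print(round(xmin, 4))  # -> 3.0
```

Swap the toy `grad` for a real gradient of a simulation or a neural network and you get the neural-SDE-style applications mentioned above; that composability is the selling point.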