While running tests on an actual Windows ARM64 device, I noticed two errors in the c/libffi_arm64/build_libffi.bat file that affect the ffi.lib that is checked in. These errors make it impossible to build, so once you start getting ready to do those releases, you'll hit them anyway. Hopefully I can save you some research time.
Line 49 of the batch file builds a static library but sets -DFFI_BUILDING_DLL. For a static lib, this needs to be -DFFI_BUILDING, or else some type codes are not available.
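For context, the storage-class dispatch in libffi's ffi.h works roughly like the sketch below (paraphrased from memory, so the exact macro and comment spelling may differ between libffi versions). Building the static lib with the _DLL define takes the dllexport branch instead of the plain-extern branch intended for static builds:

```c
/* Rough sketch of the storage-class selection in libffi's ffi.h
   (paraphrased; check the header in the tree for the exact names). */
#if defined _MSC_VER
#  if defined FFI_BUILDING_DLL   /* building libffi itself as a DLL */
#    define FFI_API __declspec(dllexport)
#  elif defined FFI_BUILDING     /* building or consuming a static libffi */
#    define FFI_API
#  else                          /* consuming libffi as a DLL */
#    define FFI_API __declspec(dllimport)
#  endif
#else
#  define FFI_API
#endif
```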
Line 53 reads from %0, but we may have already called shift and invalidated that value. This line needs to move to before Check_Opts.
After these changes, the ffi.lib file needs to be rebuilt. I could provide the file for you, but since you've got no reason to trust me, I figured I'd just leave it to the maintainers.
@zooba I also looked a little sideways at the -DFFI_BUILDING_DLL thing a few times when poking at this last time, but it passed the tests, so I left it alone.
I've rebuilt the static lib with -DFFI_BUILDING (after much pain getting a working Win/arm64 libffi build env installed again- maybe I'll write it down this time ;) ) and it passes the tests as well- a cursory examination shows the symbol tables of the two versions look more-or-less the same. I'm wondering if the real problem is that we need to have a surrogate no-op reference of some sort to all the expected FFI stuff that we need to be available from CFFI's extension, since the linker may be dropping symbols and chunks that aren't directly referenced by our extension (which wouldn't be a problem with the DLL version).
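To make the idea concrete, something like this hypothetical table compiled into the extension is what I'd mean by a surrogate reference - nothing like it exists in the tree today, it's only a sketch of the notion:

```c
/* Hypothetical "surrogate reference" sketch, not actual CFFI source: an
   external-linkage table of pointers to the libffi type globals, so that the
   extension object itself contains direct references the linker must keep. */
#include <ffi.h>

void *cffi_keep_ffi_type_symbols[] = {
    &ffi_type_void,
    &ffi_type_uint8,  &ffi_type_sint8,
    &ffi_type_uint16, &ffi_type_sint16,
    &ffi_type_uint32, &ffi_type_sint32,
    &ffi_type_uint64, &ffi_type_sint64,
    &ffi_type_float,  &ffi_type_double,
    &ffi_type_pointer,
};
```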
Can you share a standalone reproducer that hits the missing symbols with the existing version of the static lib? I verified again that the entire test suite passes on real Win/arm64 hardware, so if we've got a shortcoming in the tests, I'd like to get that corrected.
The .bat script has all sorts of little problems around input and looping, but since the only input currently used is whether or not to install cygwin, I'm not seeing an actual problem at a glance. Ideally we'd just move the whole static lib build over to use the .vcxproj arm64 build stuff in libffi and be able to ditch the need for cygwin entirely, but I wasn't able to get that working quickly...
(On an x64 machine, where python refers to an x64 Python install and $(PYTHON_LIBS) refers to a directory of ARM64 import libs taken from Nuget.)
I'm wondering if the real problem is that we need to have a surrogate no-op reference of some sort to all the expected FFI stuff that we need to be available from CFFI's extension, since the linker may be dropping symbols and chunks that aren't directly referenced by our extension.
I'd have to look into the code again, but going by names alone I'd be surprised if you were meant to choose ..._DLL for a non-DLL build.
You may be right about that, but I think we're already referring to them and it's failing to resolve? It may be that the names are being exported differently depending on the dllexport marking, and if they were marked dllimport on the CFFI side it would find them. But since we know it's going to be statically linked, we should probably just mark them as such (by omitting the marks) on both sides of the equation.
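As a rough illustration of the "both sides" point (hypothetical snippet, not CFFI's actual source), the consumer side only needs the define before the include to keep the references plain:

```c
/* Hypothetical consumer-side snippet, not CFFI source: with FFI_BUILDING
   defined before including ffi.h, the declarations stay undecorated and the
   call below resolves against a static ffi.lib.  Without it, MSVC marks them
   dllimport and emits __imp_-prefixed references, which typically show up as
   unresolved externals (or LNK4217 warnings) when no import library exists. */
#define FFI_BUILDING
#include <ffi.h>

int main(void)
{
    ffi_cif cif;
    ffi_status rc = ffi_prep_cif(&cif, FFI_DEFAULT_ABI, 0, &ffi_type_void, NULL);
    return rc == FFI_OK ? 0 : 1;
}
```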
since the only input currently used is whether or not to install cygwin, I'm not seeing an actual problem at a glance
When that option is used, the shift command is run, which modifies %0, and %0 is used later on to find the location of the batch file itself. So there's nothing particularly wrong with the batch file, just a non-obvious variable modification.
Ideally we'd just move the whole static lib build over to use the .vcxproj arm64 build stuff in libffi and be able to ditch the need for cygwin entirely, but I wasn't able to get that working quickly...
I haven't even seen this - we're using a separate build for CPython using Cygwin that I run once and keep the binaries around. (After having a quick look) Well, it certainly seems like a better option than Cygwin :) I guess the hard part is platform detection, and once you know which platform it'll be, the rest is straightforward. Still, best to let the libffi project control it so they can make changes.
Also just noticed that libffi's MSVC build project sets FFI_BUILDING_DLL and then builds a static lib... but they also set Link settings (which aren't used) instead of Lib, so maybe their builds work just enough to pass a smoke test?
My reproducer was just building a wheel from source
I need sources for something I can build that will reproduce the failure you're seeing- I'm not a direct user of CFFI, and you're basically telling me nothing will build. The test suite happily builds thousands of things and runs them successfully, so can you point me at sources for something that actually fails?
After playing around a little more trying to get that libffi vcxproj to work, I suspect that it may have worked at one point, but doesn't line up with the current sources. It's shortcutting the lack of autoconf et al by just including static output from $some_3_year_old_version_of_the_source
... or are you trying to build a wheel of CFFI itself? It's not clear to me where you're hitting a problem. All the tests I'm doing are running against a built wheel of CFFI, so the wheel build of CFFI clearly "works on my machine" anyway
Sorry, yes, I'm building CFFI itself on an x64 machine targeting Windows ARM64 (i.e. a win_arm64 wheel, which currently only works with the 3.11 beta or if you build an earlier version of Python specifically for ARM64). It's a build scenario that you're almost certainly not covering right now (since practically nobody has Windows ARM64 hardware yet) - I'm just jumping ahead to make sure you'll be able to light it up easily.
So basically, I'm pulling the sources directly from default, patching this line (because it isn't detecting a cross-compile - I suggest os.getenv("VSCMD_ARG_TGT_ARCH") == "arm64" or platform.machine() == "ARM64"), then running setup.py build_ext with the env variables and extra libs I mentioned above.
(Aside: cross-compiling with Python on Windows right now is terrible, sorry. Most projects have it real easy, but some of the lower-level ones carry the weight. The worst part is getting the right python311.lib, but they're up on Nuget, so downloading can be scripted, and then pass them with build_ext -L <libs>. So far, hacking it into the projects that need it has been the only way - even if we put it in setuptools, most projects that need it would be bypassing it anyway to work around other issues ;) )
All that said, it looks like there may have been a change since 1.15.0 that has improved things, and it appears to be related to the FFI_BUILDING definition in setup.py. But I'm running in 3 different configurations and haven't got it all consistent yet, so I'll post again if/when I can confirm.
I'm mainly here to deal with the packaging/test/CI stuff- directly supporting and verifying cross-compilation of CFFI wheels is pretty hard for me to justify spending time on. I already spend way too much time supporting Windows and Mac builds on this and PyYAML in addition to all the wacky Linux arches in the name of giving back to the community (since my day job is maintaining Ansible).
I also just realized you're the one that's maintaining the nuget.org Python arm64 builds- thanks a million for that, saved me a ton of pain building my own once someone pointed me at those.
We can definitely bring in the envvar change you mentioned to check for an override arch, but I'm curious what other libs you're pulling in or other customization you're doing- at least on real hardware python -m build -w . gets the job done for me from a vanilla command prompt.
The "other libs" are the import libraries for python3x.dll. Normally they're included in the Python install directory, but when you're running x64 Python you get x64 import libs, not the ARM64 ones. So it's just pulling down the ARM64 Nuget and grabbing the libs out of there to use them instead.
And the build envvars work from a vanilla prompt - only the manual suffix is the tricky one, because you need to know what it'll be. I'm hoping to get most of this logic into cibuildwheel, but since it's all pretty setuptools specific, it's harder to justify. It's just what is needed right now.
I haven't tried anything with the build package yet, but since there's no way to request a cross-compile via PEP 517 specifications, it's likely there'll still need to be environment variables.
But basically, I'm not expecting anyone to spend time verifying ARM64 wheels until there's easily available VMs (and hopefully free CI via GitHub or Azure Pipelines, and any other service that wants to offer it). When that's available, it ought to be easy enough to add it into a CI configuration, but it'll be even easier if most of the kinks have already been worked out.
Ah, that makes sense then. So I'm guessing if it's still having problems even on the 1.15.1 bits, there's some other sneaky arch-specific thing that's slipping through on the cross-compile, because it definitely works fine when doing the normal build on native arm64 hardware.
I'm not expecting anyone to spend time verifying ARM64 wheels until there's easily available VMs
Heh, I already had to build my own custom setup for securely hosting one-shot Apple Silicon Mac VMs for CI on this and PyYAML a couple years ago (relying on Rosetta2 to run the Mac Intel GHA runner and just wrapping all the build steps in arch -arm64). Turned out that adding Windows/arm64 VM support on top of that wasn't too hard either, now that Win11 supports running x64 binaries natively (so I did the same trick with the existing Windows Intel GHA runner).
If you're happy to run builds on your own hardware, then you should be good to go already :) Would be nice to start seeing releases for 3.11 beta, as there are a lot of projects that can't move forward themselves without your release.
Yep, that was why I was worried about this one, because it was like "uh, it seems to have been working fine, so what's the deal here?".
Would be nice to start seeing releases for 3.11 beta
That's the plan- I'm torn between just pushing py3.11 wheels out with 1.15.1 vs having a long-ish 1.15.2 beta and trying to time its release to coincide with ~3.11.0rc1. The former is probably fine and would get more people exposed to it without having to opt in to prereleases, so long as a future beta doesn't have an ABI breakage (which has happened nonzero times). I'm open to opinions if you've got them ;)
I think as long as 3.11 is in prerelease it's implied that any packages targeting it specifically are also in prerelease, so I'd personally be fine with it. It's certainly better than nothing.