30 Comments

  • Gigaplex - Friday, October 2, 2015 - link

    This is definitely a boon for app startup times, but it isn't likely to make much of a difference in compute-heavy codepaths. It is extremely difficult to write C# code that is CPU cache friendly, for example; the whole concept of everything being a reference makes it near impossible for the prefetcher to work. No amount of native compilation will fix that; the data structures themselves need to change.

    I hope this comes to desktop applications soon, as the startup times on some of our large in-house C# projects at work are abysmal.
  • valeriob - Friday, October 2, 2015 - link

    On the contrary, C# value types enable exactly that.
  • Gigaplex - Friday, October 2, 2015 - link

    C# value types often don't help much when you're in a loop, because most of the container classes break cache locality. Then there's the boxing/unboxing overhead undoing some of your optimisations. Plus, if you stick strictly to value types, you lose a lot of the value of C#, so you may as well use something better suited to the task.
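
    To make the locality point concrete, here's a minimal C# sketch (type names are illustrative, not from the article): an array of structs is one contiguous block, an array of class instances is an array of references into the GC heap, and boxing a struct into a non-generic container allocates a heap object per element.

    ```csharp
    using System.Collections;

    struct PointStruct { public float X, Y; }   // value type: stored inline in the array
    class PointClass { public float X, Y; }     // reference type: the array holds references

    static class CacheDemo
    {
        static void Main()
        {
            // Contiguous memory -- the prefetcher can stream through this.
            var packed = new PointStruct[1000000];

            // Array of references; each object may land anywhere on the GC heap,
            // so a traversal can miss cache on every element.
            var scattered = new PointClass[1000000];
            for (int i = 0; i < scattered.Length; i++)
                scattered[i] = new PointClass();

            // Boxing: a non-generic container copies the struct to the heap,
            // undoing the locality win (a generic List<PointStruct> would avoid this).
            var boxed = new ArrayList();
            boxed.Add(new PointStruct());
        }
    }
    ```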
  • valeriob - Saturday, October 3, 2015 - link

    Good point, but we're talking about general purpose applications, and it's good enough; if you need more, it's easy to link a C++ component for the heavy lifting :D
  • HollyDOL - Tuesday, October 6, 2015 - link

    With .NET 4.6 you can avoid this hit if you use SIMD/AVX2-compatible looping with RyuJIT. Ofc it's not always possible or easy, but there is a way.
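
    A rough sketch of what that looks like, assuming the System.Numerics.Vectors package on .NET 4.6 with the 64-bit RyuJIT (names here are illustrative):

    ```csharp
    using System.Numerics;

    static class SimdDemo
    {
        // Element-wise add; RyuJIT maps Vector<float> onto SSE/AVX registers.
        static void Add(float[] a, float[] b, float[] result)
        {
            int i = 0;
            int width = Vector<float>.Count;      // e.g. 8 floats with AVX2
            for (; i <= a.Length - width; i += width)
            {
                var va = new Vector<float>(a, i);
                var vb = new Vector<float>(b, i);
                (va + vb).CopyTo(result, i);
            }
            for (; i < a.Length; i++)             // scalar tail
                result[i] = a[i] + b[i];
        }
    }
    ```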
  • FieryUP - Friday, October 2, 2015 - link

    I don't think CPU cache friendliness is all that important with C#. For performance code you might as well pick C++ over C#, or move the most performance-critical sections to C++ while keeping the UX and the majority of the code in C#. VS2015 is very good at that: I have a solution with 3 app projects for 3 different platforms (WP8.1, Win8.1 Store Apps, Win10 UWP), and 2 additional projects shared by the 3 app projects. One of the shared projects is C#, the other is C++. It all works very smoothly, no matter what app platform or target ISA I choose.
  • Cogman - Friday, October 2, 2015 - link

    Not every type in C# is a reference type. There are also value types that you can define, although they have different rules from reference types. Further, you are allowed to do stack allocations in C#.
  • gamoniac - Monday, October 5, 2015 - link

    For compute-intensive tasks, C# does support the use of unsafe blocks, which allow you to access the unmanaged heap and use pointer and address-of operators. While I have not personally benchmarked it, it is said to be faster than invoking C functions via DllImport() because it does not need to switch out of the .NET runtime environment.
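
    A minimal sketch of such an unsafe block (illustrative, not benchmarked here either; compile with /unsafe):

    ```csharp
    static class UnsafeDemo
    {
        // 'fixed' pins the managed array so the GC cannot move it
        // while the raw pointer is in use.
        static unsafe int Sum(int[] data)
        {
            int total = 0;
            fixed (int* p = data)
            {
                for (int i = 0; i < data.Length; i++)
                    total += p[i];
            }
            return total;
        }
    }
    ```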
  • danbob999 - Friday, October 2, 2015 - link

    "Microsoft will be doing the compilation themselves once the app is uploaded to the store"

    I still don't understand why Google isn't doing the same. Remove the compiler from the devices and compile in the cloud. There are only 2-3 targets anyway (ARM32, ARM64, x86-64).
  • Flunk - Friday, October 2, 2015 - link

    It's a whole different thing. High-level Android apps are already platform independent because they're written in Java. The whole point of the NDK is platform-specific native code.
  • JohnGalt1717 - Friday, October 2, 2015 - link

    No it isn't. All of the Java apps could benefit from exactly the same process, because Java works almost identically to .NET in that it's compiled to IL that is executed by a JIT. Fully compiling these would bring MASSIVE improvements in the speed and size of apps (and Xamarin etc. wouldn't take a penalty any longer either).
  • FieryUP - Friday, October 2, 2015 - link

    I agree, although there are a lot more targets than that. ARMv7, ARMv7e, 32-bit ARMv8, 64-bit ARMv8, x86 (32-bit), x64, MIPS, MIPS64 at a bare minimum.
  • Morawka - Friday, October 2, 2015 - link

    I don't think you target 32-bit ARMv8, but I might be wrong; I thought 64-bit ARMv8 had all the backwards compatibility.
  • da_asmodai - Wednesday, October 7, 2015 - link

    Because Microsoft's method means the platform is limited to only the hardware the Microsoft Store supports generating native code for. Google's method of compiling at install time means the platform can support any hardware; Google only stores the platform-independent code. If an OEM makes a new hardware platform, for example Samsung makes a new Exynos chip, then Samsung grabs the AOSP code and modifies it to meet their custom hardware needs. Users then just download the hardware-independent IL code (bytecode, in Java terms) from Google, and the hardware-specific compiler on their device compiles it to native code at install time.

    In both cases, when the user goes to run the app they're running native code, but the MS way is limited to whatever MS decides to support and puts the responsibility on Microsoft to support new hardware. The Google way, on the other hand, is not limited by Google, and the responsibility to tune the compiler to the hardware falls on the hardware maker, who knows the hardware best (using the Google-provided AOSP code as a framework).
  • JohnGalt1717 - Friday, October 2, 2015 - link

    FYI, MS also just announced shared, fully compiled .NET libraries that will be downloaded automatically as needed. This further shrinks the compiled apps, because the .NET components are no longer linked into the .exe/.dll; instead they're shared with other apps (which can also save memory, since the .NET components can be shared while still being fully compiled).

    Download usage savings is pretty huge.
  • ruthan - Friday, October 2, 2015 - link

    Hmm, fast startup is good, but the biggest C# problem besides portability is performance. Will this help app performance, and by how much?

    Because, for example, the Unity3D game engine uses a Mono .NET port for C#, and performance is the biggest issue of the whole engine.
  • Agent_007 - Friday, October 2, 2015 - link

    I would say the biggest problem with Unity performance is the OLD Mono version that they use (if they used e.g. Mono 4.0, performance would be much better, especially garbage collection performance).

    But Unity is moving to IL2CPP (iOS and WebGL already use it), so Mono performance won't matter in the future.
    http://blogs.unity3d.com/2015/01/29/unity-4-6-2-io...
  • CSMR - Friday, October 2, 2015 - link

    The flowchart suggests Microsoft is doing it properly with an IL-to-native compiler. But why then is it C#-only, rather than supporting any .NET language -> IL -> native?
  • Gigaplex - Friday, October 2, 2015 - link

    According to Microsoft's developer site, both C# and VB.NET work with .NET Native. Those are the only 2 .NET languages supported in WinRT I believe.
  • prisonerX - Sunday, October 4, 2015 - link

    Natively compiled languages have been faster than hand-written assembler for some time now. There are very few reasons to use it these days.
  • Jaybus - Wednesday, October 7, 2015 - link

    Well, that depends on the programmer. I think you mean that [some] compiled languages have been as fast as hand-written assembler for some time now. It is not possible to beat optimal assembly code.
  • whatacrock - Tuesday, October 6, 2015 - link

    This is pure crap. Nobody cares about Microsoft's stupid store or Windows Phone.

    This will probably never be in desktop applications, which is the only thing people want. Microsoft would rather vainly attempt to capture a market they'll never have than keep the one they've already got.
  • MrSpadge - Thursday, October 8, 2015 - link

    Yeah, nobody wants apps which are available and similar on all their devices and which take care of updating themselves. And the worst: on Win 10 they simply look and feel like other programs! How shall people know what to hate with all that unification?
  • Guspaz - Tuesday, October 6, 2015 - link

    This is a pretty terrible decision. If your code is compiled to native machine language but you have no way of running that code yourself (because it's only generated when you send it to Microsoft), developers no longer have any way of accurately profiling their applications, and most developers (who don't use the Windows Store) won't benefit from it at all.
  • IanHagen - Tuesday, October 6, 2015 - link

    I can't wait for them to update the documentation with the apparent shift in focus from HTML5 to C# for "metro" apps that started recently. It's a bloody mess that bizarrely focuses heavily on half-baked JavaScript "universal" apps instead of their in-house C# language. I'd take Cocoa over it any day.
  • da_asmodai - Wednesday, October 7, 2015 - link

    Compiling C# to native code has been available for desktop apps for a long time. It's called NGEN (Native Image Generator), and it's been around since at least .NET 1.1 (2003?).
  • asldkfli2n - Wednesday, October 7, 2015 - link

    NGEN generates code that still runs on the .NET Framework (it's just been pre-JITted into native code).

    .NET Native produces native code that runs on no framework at all (except for a tiny GC runtime that gets baked in).
  • da_asmodai - Thursday, October 8, 2015 - link

    I didn't say they were EXACTLY the same. NGEN output is native code, though, since it's pre-JITted, so it makes no difference to the user at runtime. If you use NGEN on the install machine (similar to how Google's ART works), you have to have the framework there to run NGEN anyway, so why would you statically link everything when you're going to need the framework regardless? The framework itself is pre-JITted as well, so it's all running as native code at execution time.

    They COULD make NGEN statically link, but what would that solve? The whole point is to distribute platform-independent code and then compile to native on any target device. If they linked in everything so no framework was required, they'd have tons of the same code over and over again: the IL version used for NGEN, plus a separate native copy for every application that used that part of the framework. Instead they have one IL version used by NGEN and ONE native code version shared by all applications that use that part of the framework.
  • aoshiryaev - Thursday, October 8, 2015 - link

    What's a "complier"?
  • EWJ - Wednesday, November 4, 2015 - link

    A complier is a new name for compiler. The core framework underpinning UWP, for which .NET Native works, compiles to memory. There are no binaries on disk at all except in Release mode. That's why Microsoft tends to refer to compilation as a 'syntax check' as of late, and hence the compiler can be called a complier... (yes, a pun, true)

    That said, NGEN creates a patching problem: NGEN could take an hour or more in the Windows 7 era, e.g. if something that was frequently inlined was patched. And because the GAC is global, you'd have to go over all dependencies of all apps to fix one small issue. With .NET Native each app has its own dependency on a specific version of a DLL, which would at the very least vary with when the app was published, so they won't all be vulnerable at the same time every time. And the whole framework is much more granular by design (NuGet everything), but even if it weren't, it still has a fairly limited footprint compared to the full .NET Framework.

    Regarding Java, people have noted it's almost 100% comparable, so it would benefit from this principle in much the same way. I leave it to others to compare versioning and compilation scope; I just don't know. But then Google doesn't seem to patch at all, so there's no argument to be made either for or against a 'Java Native' compared to compiling on install.

    I don't really think it's a problem that it's 'under Microsoft control'. Actually, I'd challenge you to design a new (Windows 10) device with a new ISA for which you do not already have a world-class C++ compiler. How else would you create firmware? And if you have that compiler, why not let Microsoft (re)compile with it? Would it matter at all? Yes, you'd have to give Microsoft your brand-new compiler, but I'd guess you'd want all the world to have it anyway, or who'd be making apps for your ISA? So Microsoft will produce the same binaries as your own machine would. I don't see how profiling is impacted. Rather, if you didn't have time, you can still submit an app for ARM even if you don't have a device that runs it on your desk.

    As for checking for nasty security issues, malicious apps, and so forth, I'm happy to see Microsoft take responsibility. I have no issue with them using IL code, which is easier to analyze than native binaries. Google, on the other hand, neither seems to have a watertight patching ecosystem nor is very effective at keeping users safe from malicious apps, judging by the reports of vulnerabilities and exploits over the last few years.

    On top of this, expect the whole principle to work on Linux and iOS pretty soon too, all of it compiling from essentially the same sources on GitHub (although, true, UAP apps are slightly more of an odd beast than a DNX target, for which this already holds, albeit in beta).
