gnustep-dev

Re: Forking on Github and feedback


From: 陈北宗
Subject: Re: Forking on Github and feedback
Date: Wed, 9 Dec 2015 21:51:50 +0800

The decision to deprecate gnustep-make stems from its fundamental incompatibility with stepcode-build, which takes an Xcode project file. Either stepcode-build has to be written using no GNUstep at all, we allow stepcode-build to conflict with gnustep-make, or we have a dependency loop on our hands if libobjc and CoreBase depend on it to build. I don't think you will find it appropriate to build a bootstrap version of stepcode-build against a static copy of Apple's CoreFoundation and then rebuild it linking against our own. This is why I am proposing moving libobjc2 and CoreBase to pure CMake or a plain Makefile, so that stepcode-build can depend on them. Or, if you are so in love with gnustep-make, can you make it understand Xcode spec files, which are at heart NeXT-style plists?
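A minimal sketch of the "NeXT-style plists" point, assuming an ordinary Xcode project on disk (the path is illustrative): Foundation's property-list reader can already parse the OpenStep-format project.pbxproj.

#import <Foundation/Foundation.h>

int main(void)
{
  @autoreleasepool {
    NSData *data =
      [NSData dataWithContentsOfFile: @"MyApp.xcodeproj/project.pbxproj"];
    NSPropertyListFormat format = 0;
    NSError *error = nil;
    NSDictionary *project =
      [NSPropertyListSerialization propertyListWithData: data
                                                 options: NSPropertyListImmutable
                                                  format: &format
                                                   error: &error];
    // For an unmodified Xcode project the format comes back as
    // NSPropertyListOpenStepFormat and the root dictionary holds the
    // "objects" graph keyed by object IDs.
    NSLog(@"format %lu, objectVersion %@",
          (unsigned long)format, project[@"objectVersion"]);
  }
  return 0;
}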

In fact, the reason for rewriting TFB also has something to do with stepcode-build: the rewritten CoreBase will have effectively half of Base inside, allowing stepcode-build to be written in Objective-C with ARC against the Base headers yet without linking against Base. This in turn allows the rest of Base to be built with stepcode-build. The copying of headers can be done with a plain Makefile.

Apple long ago reversed their decision to make CALayer optional for UI elements: since at least iOS 2.0, every UIView is backed by a CALayer. In my plan for the GUI rewrite the same principle applies to NSView and NSCell as well, since it reduces the graphics back ends we have to support down to one: OpenGL. This would allow us to deprecate gnustep-back entirely, emit more efficient graphics code, and be Wayland-ready.
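A minimal sketch of the every-view-is-layer-backed model, written against today's Cocoa API purely for illustration (the class name is made up; the proposal above would make the layer implicit rather than opt-in):

#import <AppKit/AppKit.h>
#import <QuartzCore/QuartzCore.h>

@interface LayerBackedView : NSView   // hypothetical example class
@end

@implementation LayerBackedView
- (instancetype)initWithFrame:(NSRect)frame
{
  if ((self = [super initWithFrame: frame]))
    {
      // On Cocoa layer backing is opt-in; the rewrite described above
      // would make it unconditional for every NSView.
      [self setWantsLayer: YES];
    }
  return self;
}

- (BOOL)wantsUpdateLayer
{
  return YES;   // draw by updating the layer, not via -drawRect:
}

- (void)updateLayer
{
  // Content becomes a GPU-compositable CALayer instead of immediate drawing.
  self.layer.backgroundColor = [[NSColor windowBackgroundColor] CGColor];
}
@end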

And paths? That is up to Mesa (and, in turn, the GPU) if the path is going to the display.

Sent from my iPhone

On Dec 9, 2015, at 07:08, Ivan Vučica <address@hidden> wrote:

...I slightly feel like I'm taking the bait by responding to this. I don't believe I will participate in further discussions about what sounds like it wants to be a non-upstreamable fork.

On Sat, Dec 5, 2015 at 9:26 PM, 陈北宗 <address@hidden> wrote:

Oh okay then - I will try to wrap my mind around CMake, but gnustep-make is still going away.

Because the choice is clearly between CMake and gnustep-make. And clearly, *clearly* gnustep-make *has* to go away. :-)

While it surely can be simplified (or replaced by a different tool), have you looked at the services that gnustep-make provides?

I mean, who am I to tell you what you will do in your own fork. But as far as the contributors who gathered in Dublin this year are concerned, gnustep-make, despite its flaws, is still the tool of choice for a simple reason: universal availability.
 
I will look into Swift's package manager and create a workalike. gnustep-make is not really friendly or useful for a build system that uses Xcode project files anyway.

If a project starts using .xcodeproj build rules, then sure, different definitions should be used for that particular project, and gnustep-make should provide sufficient configuration information to configure this build system.

If you plan to upstream this, it will take strong arguments, a lot of convincing and wide cross-platform support (with low resource usage) to move contributors to that build system.
 
 
4) I need dispatch-io in CoreBase and Base, and don’t forget XPC…

You don't, and you can and should #ifdef those sections out.

Still, XPC? Those NSXPC* classes have to be implemented.

Such classes should be #ifdef'd out. Not every use case would benefit from the interprocess communication system, as provided by Apple.
 
Also, this refactor has “support Swift” in mind, but that requires Foundation to have a hard dependency on libdispatch.

Whatever you wish to upstream should not make libdispatch a 'hard' dependency. 'configure' exists for a reason.

Why do you think these features have to be present under every configuration? Why would one first have to understand and port libdispatch to, say, Android in order to compile gnustep-base for that platform?
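A minimal sketch of what such a configure-driven guard could look like; HAVE_LIBDISPATCH is a made-up macro name, not an existing gnustep-base define:

#import <Foundation/Foundation.h>

#if HAVE_LIBDISPATCH
#include <dispatch/dispatch.h>
#endif

void readFileAsync(NSString *path, void (^handler)(NSData *))
{
#if HAVE_LIBDISPATCH
  /* Platforms configured with libdispatch get the asynchronous path. */
  dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    handler([NSData dataWithContentsOfFile: path]);
  });
#else
  /* Fallback for platforms configured without libdispatch. */
  handler([NSData dataWithContentsOfFile: path]);
#endif
}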
  
13) CG is full of CF classes and objects. Try this:

CGPathRef path = CGPathCreateWithRect(CGRectZero, NULL);
id path_objc = CFBridgingRelease(path);
NSLog(@"%@", [path_objc description]);

Cocoa's CG is, but GNUstep's isn't. 
As long as the developer is supposed to treat it as an opaque pointer, it does not matter.

I don't know how an Objective-C object is impacted by being passed into CFBridgingRelease(). If it is not satisfactorily impacted, I guess C wrappers around the CGPath Objective-C class in Opal could return an opaque non-Objective-C pointer that wraps this pointer, and other functions could accept it. Someone that's more familiar with ARC could comment on this.
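A rough sketch of that wrapper idea under ARC, with every name invented (MYPath, MYPathRef, the functions), assuming only that the standard CFBridgingRetain()/CFBridgingRelease() bridge casts are available on the toolchain in use:

#import <Foundation/Foundation.h>

@interface MYPath : NSObject   // stand-in for an internal Objective-C path class
@end
@implementation MYPath
@end

typedef const struct MYOpaquePath *MYPathRef;   // opaque to plain-C callers

MYPathRef MYPathCreate(void)
{
  MYPath *path = [[MYPath alloc] init];
  // CFBridgingRetain() hands a +1 reference out of ARC as an opaque pointer,
  // so the C caller now owns it.
  return (MYPathRef)CFBridgingRetain(path);
}

void MYPathRelease(MYPathRef path)
{
  // CFBridgingRelease() gives the +1 reference back to ARC, which releases
  // the object when this expression ends.
  (void)CFBridgingRelease((CFTypeRef)path);
}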

Full circle, buddy.

I don't believe we have interacted enough to make 'buddy' an appropriate, polite _expression_ in this discussion. In fact, it seems to be unpleasantly condescending. Was this intentional?
 
The way you paired OPPath to CGPath is almost exactly how I am going to refactor the entire CF, but some “exotic” Objective-C features are used here.

Have I paired OPPath to CGPath? Excuse a bit of sarcasm, but if I have, my memory is worse than I imagined :-)

If you are interested in authorship of Opal, you should use 'svn blame' or equivalent.

For example, a certain non-sanctioned tool lets you do that trivially:

 
15) The Wayland-based graphics stack requires the client program to talk directly to Mesa over EGL, so just like Qt, Clutter or GTK+ we have to speak EGL as well, or pull in yet another dependency (going against my intention of deprecating Back). EGL being a 3D API does not mean we cannot implement 2D on top of it. And I may end up implementing SpriteKit first as a “2D-on-EGL” layer and building the rest of Quartz2D on top of that. X11 is already on its slow way out, so I am not going to take a dependency there.

That does not explain how and why you plan to draw circles, bezier paths and text -- which is what Opal does -- using EGL.

The EGL part comes into play because the drawing will be done using GLUT and Mesa, which emits EGL directly (text is drawn by converting it into a path first). Essentially, then, not even Opal is drawing our graphics; the GPU is.

Maxthon... Please accept my sincere apologies for being this direct, but the very phrase "drawing will be done using GLUT" disqualifies you from performing this 'refactor', at least without first doing a lot of reading about the graphics stack.

In fact, your casual mention of "text is drawn by converting it into a path first" assumes that drawing "paths" is not the actual issue at hand, when it very much is. I'll be willing to dedicate a few hours to explaining the graphics stack to you and any interested contributors, if they ask. I can even suggest useful projects in this direction that could be accepted into the core projects.

To my knowledge, QuartzGL (a.k.a. Quartz 2D Extreme) is still not the default, and 2D rasterization is done in software.

And if it is truly that beneficial to do this on the GPU, the correct layer for it in our implementation is Cairo, which lies under Opal and is much more actively developed and used across the OSS world. Cairo happens to be able to do some compositing on the GPU. If you believe this to be beneficial, I would suggest you investigate how to enable it, and whether it would even be usable with the current design of Opal/gnustep-back with Cairo.
Opal implements a Quartz 2D-compatible API, also often referred to as Core Graphics. EGL and Wayland don't belong there.
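For reference, a minimal sketch of creating a GPU-backed cairo surface, assuming a cairo built with the experimental cairo-gl/EGL backend (not enabled in most distribution builds) and an EGL display/context already set up elsewhere:

#include <EGL/egl.h>
#include <cairo.h>
#include <cairo-gl.h>

cairo_surface_t *
make_gpu_surface(EGLDisplay display, EGLContext context, int width, int height)
{
  /* Wrap the existing EGL context in a cairo device... */
  cairo_device_t *device = cairo_egl_device_create(display, context);

  /* ...and create a surface whose rasterization runs through cairo-gl. */
  cairo_surface_t *surface = cairo_gl_surface_create(device,
                                                     CAIRO_CONTENT_COLOR_ALPHA,
                                                     width, height);

  cairo_device_destroy(device);   /* the surface keeps its own reference */
  return surface;                 /* draw on it with the normal cairo_t API */
}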

There is no way QE on OS X draws any graphics by itself during normal operation. Once my Mac accidentally booted in “standard VGA” graphics mode, QE had to do all the drawing in software, and performance dropped to the point where I could not even use the menus. This is obvious proof that Apple’s QE uses OpenGL almost exclusively, and offloading graphics to the GPU is usually good practice whenever you have one.

Maxthon, you are still mistaking Quartz Extreme, which performs scene compositing, for QuartzGL / Quartz2D Extreme, which rasterizes 2D primitives.

They serve different purposes on OS X. Compositing is the exact thing that GPUs excel at, and (as long as many of the composited images are statically resident on the GPU) resolves fillrate issues. It is a completely different assumption that rasterization of the sort that Quartz2D / Core Graphics is performing should happen on the GPU.

In a still-hypothetical gnustep-gui with CALayer integration, CARenderer in combination with a compositing window manager would come the closest to playing the role of Quartz Extreme. Opal-with-Cairo plays the role of Quartz2D, providing rasterized content for CALayers, which are then composited by CARenderer. Windows themselves would get composited by a compositing window manager.
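A minimal sketch of the CARenderer side of that picture, using Apple's QuartzCore API for illustration (a GNUstep CARenderer may differ in how it is attached to a GL context):

#import <QuartzCore/QuartzCore.h>

/* Composite one frame of a layer tree into the GL context the renderer was
   created with; the rasterized layer contents come from Opal/Cairo. */
static void compositeFrame(CARenderer *renderer, CALayer *rootLayer, CGRect bounds)
{
  renderer.layer  = rootLayer;
  renderer.bounds = bounds;

  [renderer beginFrameAtTime: CACurrentMediaTime() timeStamp: NULL];
  [renderer addUpdateRect: bounds];
  [renderer render];      /* GPU compositing of the layer tree */
  [renderer endFrame];
}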

Just as Apple does not enable QuartzGL by default, I don't think it would benefit GNUstep to spend time attempting to rasterize on the GPU all the primitives supported by our DPS implementation and all the primitives supported by Opal. More interestingly, have you looked at how our backends in gnustep-back work? It's not pretty or /that/ simple.

And rasterizing text is itself something that's hard to do as-is... do you really want to make it more complex, with even more things that can go bad (or look bad)?


If you would like to contribute, there are many interesting things that could be done. For example, if you are interested in providing the NSXPC APIs, sure, that seems reasonable. (Given that those APIs seem designed to help isolate process permissions when using sandboxing, you may or may not be able to use GNUstep's Distributed Objects for that.) But if you are a new contributor, do approach things with an 'I need advice' attitude and an 'I will try my best not to disrupt others' use cases' attitude.
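For context, a minimal sketch of the client-side NSXPC API such an implementation would need to honor; the service name and protocol here are invented:

#import <Foundation/Foundation.h>

@protocol ThumbnailService   /* invented example protocol */
- (void) thumbnailForPath: (NSString *)path
                withReply: (void (^)(NSData *png))reply;
@end

static void requestThumbnail(NSString *path)
{
  NSXPCConnection *conn =
    [[NSXPCConnection alloc] initWithServiceName: @"org.example.ThumbnailService"];
  conn.remoteObjectInterface =
    [NSXPCInterface interfaceWithProtocol: @protocol(ThumbnailService)];
  [conn resume];

  id<ThumbnailService> proxy =
    [conn remoteObjectProxyWithErrorHandler: ^(NSError *error) {
      NSLog(@"XPC error: %@", error);
    }];
  [proxy thumbnailForPath: path
                withReply: ^(NSData *png) {
    NSLog(@"received %lu bytes", (unsigned long)[png length]);
  }];
}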

