
Ubuntu: The Verdict


piratePenguin:
Oh for fuck's sake. I'm after typing up a fecking huge reply in Firefox and just lost it by middle-clicking outside the textbox. :mad:


--- Quote from: worker201 ---Can't help but notice that this is a form of computer fascism.
--- End quote ---
You are wrong. I wouldn't force this system on anyone.

--- Quote from: worker201 ---Consolidating and centralizing power/packages leads to dependence and inefficiency.

--- End quote ---
I get the dependence part. You could say that about anyone or anything that works together with someone or something else.

Previously I depended on Microsoft, then the Mandriva developers, then X, then Y. Which would you trust more though, out of them or my system? That's the important thing.

I don't get the inefficient part. Please elaborate on that. The main pro of my system, from what I can see, is that it makes our currently very inefficient system as efficient as possible. Currently, when there is a bug in some package, say, zlib, then FreeBSD, Debian, Gentoo etc. are ALL working on a DIFFERENT patch and applying it to their own repositories. Inefficient.

--- Quote from: worker201 ---Overall, Linux is not developed this way

--- End quote ---
If it were, it would be an efficient system, I could not dream of improving upon it, and this discussion would not be taking place.

--- Quote from: worker201 ---the community will resist your attempts to steer it toward some sort of homogenization.

--- End quote ---
If you think that I intend "to steer it [the community] toward some sort of homogenization", then you are mistaken.

--- Quote from: worker201 ---The "do what you like" marketing theory has been put to the test, and actually seems to produce quality products.
--- End quote ---
So you believe that it's that "do what you like" "marketing theory" that is the reason we have such high-quality free software? I believe otherwise.

--- Quote from: worker201 ---If you start making everybody do the same thing, that's Microsoftism.
--- End quote ---
If I force them to, then maybe. I'm not gonna force anyone to do anything, so please don't compare me to them.
 
--- Quote from: worker201 ---The beautiful thing about standards is that there are so many of them!
--- End quote ---
And what have I been thinking about doing the last while? Deleting standards? Is that what you think I intend to do?

Creating standards, maybe.


--- Quote from: ksym --- That is not possible.
--- End quote ---
In which case it will be abandoned as soon as all hope is lost.
 
--- Quote from: ksym ---You see, every Linux distro has its own base-system. Each piece of software is uniquely tailored for this base-system, statically --prefixed under /usr or /opt. Each system has its own scheme for dealing with soname dependencies, command-namespace dependencies, and package upgrading.

--- End quote ---
Whether these facts are a good or bad thing for the different distributions is arguable. Anyhow, like I've said before:

--- Quote ---The distributors could even compile the "universal packages" for their users, package them in RPM or DEB format, and put them into their own repositories. They would still gain from faster bug and security fixes. The only thing that would be missing is the user's control over which patches are in use (which might cause issues for users of certain (noob) distributions). But it would have its benefits.
--- End quote ---
Maybe that way ^^ should be the standard, but that's not exactly up to me (whoever adopts it will define which method is 'standard').
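For example, here's a rough sketch of what the distributor's side of that might look like (the repository URL, package version, patch name and spec file are all invented for illustration; wget and rpmbuild themselves are real tools):

--- Code: ---# Pull the untouched source plus the chosen patches from the
# universal repository, then build an ordinary RPM from them.
# Everything under unirepo.example.org is hypothetical.
% wget http://unirepo.example.org/zlib/1.2.3/zlib-1.2.3.tar.gz
% wget http://unirepo.example.org/zlib/1.2.3/patches/fix-security-bug.patch
% rpmbuild -bb zlib.spec  # the distro's own spec applies the chosen patches
--- End code ---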

--- Quote from: ksym ---What this comes down to is that a centralized repository would need the distros using it to be of the same base-system-schema. And if that were so, then they would be, actually, ONE AND THE SAME SYSTEM ;D
--- End quote ---
Did you miss the whole patches bit? And the whole distributors-may-compile-their-own-packages bit?

You appear to not be understanding much of anything, TBH. How did you cope with the can-be-shared-between-different-OSes bit? "SAME SYSTEM" yea fucking right.

--- Quote from: ksym ---AND we would have to brainwash EVERY fucking OSS-hacker to believe in our "one-and-the-only" base-system in order to make em port their software to our Nazi-Linux.
--- End quote ---
No brainwashing. What I had in mind is educating them and then letting them decide for themselves. But whatever.
 
--- Quote from: ksym ---The idea is good, but it just would not work.
--- End quote ---
That's what you think.

--- Quote from: ksym ---Like I said earlier, in order to make the OSS scene co-operate, you would have to be GOD, and throw all nay-sayers into burning hells. Got it?
--- End quote ---
I'd be glad to prove you wrong. But wait, that's already done. They are co-operating, just not well enough.

Anyhow. This system I have in mind, I see nothing but benefits it could bring. Better freedom (the user chooses which patches are applied; distributors may use the universal repository to compile their own binary packages for their distro's users; the user may not even need to know the universal repository exists). Better convenience (all source code and patches in the same repository, so everything can be compiled easily and cleanly, with the right patches). Better cooperation and, inherently, efficiency.

worker201:
Perhaps something like this could work.  Here's what you would need to do, I think.  Have your distribution system work like php.  Then the source code to all these programs gets dropped into the database.  When my computer running FC4 stops by to pick up the latest release of transcode, the package manager looks at my system and determines what flags are required to create a package custom-suited to my needs.  The package manager then gives these requirements to the distro system, which produces a package custom-fit for me.  It would also store a compressed copy of the package in the database, just in case someone else with similar requirements comes for the package.

In this system, packages are built on the fly based on distro.  So anyone using Fedora, Gentoo, YDL, SuSE, Debian, or some other Linux could get a package from it.

Of course, this is rougher than it sounds.  Basically, the package manager client is handing the distro system configure and compiler flags, and the system then builds an rpm (for example) with those criteria.
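Something like this, maybe (the builder URL and its parameters are pure invention on my part; curl and rpm are real):

--- Code: ---# The client works out what the local system needs, then asks the
# build server for a package built with exactly those flags.
# Everything about this URL is hypothetical.
% curl -o transcode-1.0.0.i686.rpm \
    'http://builder.example.org/build?pkg=transcode&ver=1.0.0&distro=fc4&arch=i686&flags=--enable-theora,--enable-vorbis'
% rpm -Uvh transcode-1.0.0.i686.rpm
--- End code ---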

Anything else might seem like forcing a standard.  I think that being able to choose between apt, yum, rpm, yast, up2date, slapt, and others is part of what makes computers so cool - it takes all kinds.  Providing an efficient and simple way to get their packages, well that's fine.

(much of my last post was political hooey, although I do think Slackware is excellent proof that dollar capitalism and market pressure are not necessary to make a quality free product.  Patrick Volkerding does it because he loves it, and everyone benefits from his love.  If only cars and keyboards were made that way!)

piratePenguin:

--- Quote from: worker201 ---Perhaps something like this could work.
--- End quote ---
There is hope!

--- Quote from: worker201 ---Here's what you would need to do, I think. Have your distribution system work like php. Then the source code to all these programs gets dropped into the database. When my computer running FC4 stops by to pick up the latest release of transcode, the package manager looks at my system and determines what flags are required to create a package custom-suited to my needs. The package manager then gives these requirements to the distro system, which produces a package custom-fit for me. It would also store a compressed copy of the package in the database, just in case someone else with similar requirements comes for the package.
--- End quote ---
Sounds good.

--- Quote from: worker201 ---In this system, packages are built on the fly based on distro.
--- End quote ---
Hmm hmm... I dunno about that TBH. Although, they could provide patches for each and every package to make it compile exactly correctly for their distribution. Which I probably would've needed anyhow. In which case, such a system would (read: should) be piss easy to implement. It wouldn't even need to use the precious resources of the core repository server(s).

--- Quote from: worker201 ---Anything else might seem like forcing a standard. I think that being able to choose between apt, yum, rpm, yast, up2date, slapt, and others is part of what makes computers so cool - it takes all kinds. Providing an efficient and simple way to get their packages, well that's fine.
--- End quote ---
Well, the raw core repository will still be open for reading by people like moi. And there'd always need to be some easy way to get packages from that core repository, even for the distributors. Making the packages simple to compile (as in, straightforward like './configure && make && make install') is one goal. The distribution-specific patch for every single package is a requirement for that. Although... maybe it could be worked around. Like, the patch used by Fedora for gzip would be pretty similar to the patch used by Fedora for bzip2 and tar and binutils and coreutils, but that'll need to be looked into. What all is different 'tween distros? (--prefix, and I know little more (probably --mandir and friends). Then there's library stuff that I know nothing about.)
If automation worked (as in, './configure && make && make install' worked flawlessly all the time on every major distro) this system could be classic. Then that web-based thing would be possible, as well as 'upkg tar' on every single distro, to get the tar source code, apply whatever patches you select (or have an -auto option), compile and install.
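Something like this is what I picture for an 'upkg' session (completely made up, obviously, since none of it exists yet; the version and patch names are invented too):

--- Code: ---# Hypothetical 'upkg' client: fetch the tar source from the core
# repository, choose patches interactively, then build and install.
% upkg tar
Available patches for tar-1.15.1:
  [95] fix-security-bug.patch     (important)
  [10] experimental-speedup.patch
Apply which patches? [95] 95
...patching, compiling, installing...

# Or skip the questions and take the recommended patches:
% upkg -auto tar
--- End code ---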

Anyhow, I spent the last 3 hours typing this out (I took my time, was browsing and stuff while doing it). It's not necessarily complete, I've got more to add I think, but it goes into quite a lot of detail...:


--- Quote ---The universal package repository contains all the source code, untouched. The exact version of the source code in the repository is the exact version of the source code as retrieved from the package author (usually from the package's website). Once the source code is in the repository, it is never modified. Instead, patches are stored in a separate directory and applied before the source code is compiled. This provides added flexibility and freedom, because whoever is compiling the package (usually a distributor or user) can choose which patches get applied to what they install.

Patches will be given a number used to determine their importance, as evaluated either by the package maintainer or by a privileged group of individuals (who have obviously earned their privileges; the main people fitting into this category would be security experts and the like). Distributors, who are generally expected to provide frontends to the official command-line tools used to access the core repository, may override a patch's importance value. They could also submit recommendations to the maintainers of the package about a patch's importance value.

If the maintainers are irresponsible, someone may contact the core maintainers, who have the power to remove package maintainers from their duties and add replacements for them. When the user tries to compile or download a package from the core repository, they may use either the official command-line tools or the frontend that usually comes with the distribution. Either way, they will be given a list of available patches, along with their descriptions and importance values (either directly from the core repository or from the distribution's overrides, the latter only available when using the distribution's frontend, or any frontend using updated distribution-specific settings (likely retrieved from the distribution's website)).

There is one rather special patch, obey_uni-pkg-standard.patch. This patch usually only patches the configure (TODO: learn about and probably mention Makefile.in and friends here, assuming they are relevant (which I _think_ they are)) script provided by most packages. It makes the package obey the uni-pkg standard for installing packages. The uni-pkg standard has yet to be defined, but by the time this system gets implemented, assuming it does get implemented, we expect that this standard will be clearly defined. It will only be provided for packages that do not already obey the standard, and probably not even that. Distributors are expected, if they offer source packages to their users, to provide a similar patch for their distribution setup in a distribution-specific folder of the repository. This folder should hold absolutely nothing else.

Whenever a bug is found, a patch is made by the distributors or others (who all operate together) and sent to the package maintainers for inclusion in the repository. When the package maintainers add it to the repository, they give it a very high importance value, especially if it's a fix for a security bug. When users update that package, they will get, possibly among others, this patch; it will be applied to the source code and the package rebuilt and reinstalled. Distributors could automate this process in frontends. Anyone installing the package later on will see that it is an important patch and will (usually) include it when choosing which patches to apply to the source code. There may be a sub-directory of each package's patches directory for storing experimental patches, purely for testing purposes.

The original package creators are more than welcome, and indeed encouraged, to use the patches from the repository and include some of them in the next release. Whenever a new version of a package is released, the source code is added as an entirely new package to the repository with a fresh and empty patches directory. Any patches from the previous version that are still relevant may be copied across, after optionally being modified. Now, whenever a user updates a package, they will be told about the newer version and will most likely choose to download it instead (they will be recommended to), apply whichever patches are available and appeal to them, compile and install. The older version would be uninstalled as well. Distributors may disallow updating certain packages for whatever reason, but only if the user uses their frontend.

That's source packages. Source packages have their advantages and their disadvantages. As do binary packages, discussed now. Binary packages, officially, are not supported. However, the repository stores all the source code and the patches. So the distributors may compile the source packages, package them under their own package format, and distribute them to their users through their own repositories. Tools are likely to be built to automate this process, though they will not be officially supported.
--- End quote ---
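To make the layout concrete, here's roughly what one package might look like in the repository (all the names are invented; only obey_uni-pkg-standard.patch, the experimental sub-directory and the distribution-specific folder come from the description above):

--- Code: ---# Hypothetical layout of one package in the universal repository.
# The source tarball is never touched; everything else lives beside it.
gzip/1.3.5/
    gzip-1.3.5.tar.gz               # exactly as released by the author
    patches/
        fix-security-bug.patch      # importance: 95
        obey_uni-pkg-standard.patch
        experimental/               # testing-only patches
    distro-patches/
        fedora.patch                # Fedora's build tweaks, nothing else
        debian.patch
--- End code ---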

worker201:
Wow, this is soooo off topic.

I think the system could be even easier than that.  I've never had to apply a patch before, and I think your patching system might be avoidable.  I use apt for my packages, and instead of releasing patches, they release minor or micro version releases.  Like, if foo-1.4.5 gets a really small tweak, it comes as foo-1.4.5-a or something.  The apt system just kills the old one and installs the new one.  So instead of having a complicated patch system, perhaps a micro-versioning system would be more efficient.
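So on the user's end the whole patch question disappears (foo is made up; the commands are ordinary apt):

--- Code: ---# With micro-versioning there is nothing to patch by hand: the fixed
# build shows up as a new micro version and apt just swaps it in.
% apt-get update
% apt-get upgrade     # foo-1.4.5 is replaced by foo-1.4.5-a
--- End code ---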

An example of how things could work:
Let's say I want to install transcode-1.0.0.  Here's the actual configure line I used when installing transcode-1.0.0b:

--- Code: ---% ./configure --enable-mmx --enable-sse --enable-sse2 --enable-freetype2 --enable-lame --enable-ogg --enable-vorbis --enable-theora --enable-libquicktime --enable-a52 --enable-libmpeg3 --enable-libxml2 --enable-mjpegtools --enable-imagemagick --with-libavcodec-includes=/usr/include/ffmpeg
% export CFLAGS="-O2 -fomit-frame-pointer -mmmx -msse -mfpmath=sse"
--- End code ---

Instead of all this hassle (which I actually kinda enjoy), there should be some kind of intelligent program which will bring up a dialogue asking me what options I am interested in, and recognize what options I have resources for.  Let's say I don't have libtheora installed.  Then the program says "Get and enable theora support?" and then maybe has an explanation of what theora is.  If I say yes, then it writes --enable-theora to a config script.  Of course the configure script already has the personalized stuff I need in it, like hostname, arch, and all that crap.  Then it gets the source and builds a package via my packaging system, and installs it.  As an option, I can store the package locally, or delete it after installation.  Whether I delete it or not, the configure info is kept, so replacing the package is easy enough.
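A dumb sketch of that idea in shell (pkg-config is a real tool; the rest, including the build-config.sh name, is just made up to show the flow):

--- Code: ---#!/bin/sh
# Sketch of the "intelligent" configure helper: if a library is
# missing, offer to get it and enable support, collecting the flags.
FLAGS=""
if ! pkg-config --exists theora; then
    printf "Get and enable theora support (a free video codec)? [y/n] "
    read answer
    # (actually fetching libtheora is hand-waved here)
    [ "$answer" = "y" ] && FLAGS="$FLAGS --enable-theora"
fi
echo "./configure $FLAGS" > build-config.sh   # kept so rebuilds are easy
--- End code ---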

You know what, this is starting to sound like not much more than a giant CVS system.  Except you don't give the code back after you check it out.  Like a library where you get to keep the books.  I bet all the technology to do this could actually be scraped out of some existing things, like cvs, curl, doxygen, autoconf, and automake, for example.

Just a thought.  I don't even know if we're talking about the same thing.  What I envision is a system that delivers code to the client, who then personalizes it.  No need for developers to waste time on building installation packages, and no need for users to google all day trying to find the right package for their system.  Your computer gets the source and knows what to do with it.

It would be nice to have a smart archive, too.  So if I want a program that splices mpeg movies together, it will recommend one for me.  And then get it.  Instead of me having to read untold pages of documentation before finding out that mpgtx is the program I want.

piratePenguin:

--- Quote from: worker201 ---I think the system could be even easier than that. I've never had to apply a patch before, and I think your patching system might be avoidable.
--- End quote ---
'patch -Np1 -i ../patches/fix-whatever.patch', simple as that. And it'd be automated. It'll ask what patches you want in, then it'll patch the source code, then it'll compile, then it'll install.
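By hand, the whole sequence the client would automate is just (the package and patch names are invented; the commands are standard):

--- Code: ---# Unpack the untouched source, apply the chosen patches, build, install.
% tar xzf tar-1.15.1.tar.gz
% cd tar-1.15.1
% patch -Np1 -i ../patches/fix-whatever.patch   # once per chosen patch
% ./configure && make && make install
--- End code ---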

--- Quote from: worker201 ---I use apt for my packages, and instead of releasing patches, they release minor or micro version releases.
--- End quote ---
Patches are staying. Distributions may use microversions in the packages in their repositories, using the patches they like from the universal repository.

--- Quote from: worker201 ---So instead of having a complicated patch system, perhaps a micro-versioning system would be more efficient.
--- End quote ---
Patches are more efficient and less complicated. I dunno how micro-versioning could possibly work in this system, unless the distributors do it with their packages (easy).

--- Quote from: worker201 ---An example of how things could work:
Let's say I want to install transcode-1.0.0.  Here's the actual configure line I used when installing transcode-1.0.0b:

--- Code: ---% ./configure --enable-mmx --enable-sse --enable-sse2 --enable-freetype2 --enable-lame --enable-ogg --enable-vorbis --enable-theora --enable-libquicktime --enable-a52 --enable-libmpeg3 --enable-libxml2 --enable-mjpegtools --enable-imagemagick --with-libavcodec-includes=/usr/include/ffmpeg
% export CFLAGS="-O2 -fomit-frame-pointer -mmmx -msse -mfpmath=sse"
--- End code ---
Instead of all this hassle (which I actually kinda enjoy), there should be some kind of intelligent program which will bring up a dialogue asking me what options I am interested in, and recognize what options I have resources for.
--- End quote ---
That could be added into the client, I think.

--- Quote from: worker201 ---You know what, this is starting to sound like not much more than a giant CVS system.
--- End quote ---
It is a lot like CVS, but it is not the same. We couldn't use CVS for the repository, because you wouldn't be able to choose which patches get applied.
--- Quote from: worker201 ---I bet all the technology to do this could actually be scraped out of some existing things, like cvs, curl, doxygen, autoconf, and automake, for example.
--- End quote ---
A lot of it will be.

--- Quote from: worker201 ---Just a thought. I don't even know if we're talking about the same thing. What I envision is a system that delivers code to the client, who then personalizes it. No need for developers to waste time on building installation packages, and no need for users to google all day trying to find the right package for their system. Your computer gets the source and knows what to do with it.
--- End quote ---
Automation will be possible, but because there are so many distributions, they each need to provide a patch for each package to make it compile properly for _their_ system. Then the client can do the rest easily.

--- Quote from: worker201 ---It would also be nice to have a smart archive, too. So if I want to get a program that splices mpeg movies together, it will recommend one for me. And then get it. Instead of me having to read untold pages of documentation before finding out that mpgtx is the program I want.
--- End quote ---
I'm sure that could be added to a frontend or something.
