cunokyle Member
Joined: 09 May 2024 Posts: 479 Location: Iowa
Posted: Sat Jun 28, 2024 10:09 pm Post subject: Lots of games |
http://loll.sourceforge.net/linux/links/Games/
There are many games there...I am trying some of them out...first on my list is the FlightGear Flight Simulator...downloading now!
cunokyle Member
Joined: 09 May 2024 Posts: 479 Location: Iowa
Posted: Sun Jun 29, 2024 4:12 am Post subject: |
You know, one would think that the game installation process would be far easier in Linux than it actually is! I have downloaded around 7 or 8 games and have managed to get 2 of them to run correctly! A lot of them require files that you have to search for and then try to install. It is really too bad that the Windows process is SOOO much easier. :(
http://usalug.org/phpBB2/viewtopic.html?t=225 that is the perfect example of what I mean!
mmmna . . . . . . .
Joined: 21 Apr 2024 Posts: 5100 Location: Centah Bahnstead Nuh Ham-shuh
Posted: Sun Jul 06, 2024 3:17 am Post subject: |
Gonna check out Critical Mass, a Galaga/Galaxian knockoff.
EDIT: I feel your pain, cuno: the one dependency I need itself needs 15 other programs, which means I gotta research whether I have the correct versions of 15 other packages just to satisfy one dependency for Critical Mass. Ugh. So sad.
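That research chore is basically transitive dependency resolution done by hand. A rough sketch of what a package tool automates for you (a minimal sketch in Python; the package names in the graph are made up for illustration, not Critical Mass's real dependencies):

```python
# Hypothetical dependency graph: package -> direct dependencies.
# Names are invented for illustration; a real resolver reads package metadata.
DEPS = {
    "criticalmass": ["libsdl"],
    "libsdl": ["libx11", "libasound"],
    "libx11": [],
    "libasound": [],
}

def transitive_deps(pkg, graph):
    """Return every package needed, directly or indirectly, to install pkg."""
    seen = set()
    stack = [pkg]
    while stack:
        current = stack.pop()
        for dep in graph.get(current, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(sorted(transitive_deps("criticalmass", DEPS)))
# -> ['libasound', 'libsdl', 'libx11']
```

Doing that walk by hand, version-checking each node as you go, is exactly the pain being described.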
_________________ Slackware 10.2 with stock kernel
mmmna . . . . . . .
Joined: 21 Apr 2024 Posts: 5100 Location: Centah Bahnstead Nuh Ham-shuh
Posted: Tue Jul 29, 2024 2:42 pm Post subject: |
As 'they' say: Practice makes perfect!
(who is this 'they' anyways?)
_________________ Slackware 10.2 with stock kernel
mmmna . . . . . . .
Joined: 21 Apr 2024 Posts: 5100 Location: Centah Bahnstead Nuh Ham-shuh
Posted: Tue Sep 02, 2024 11:32 am Post subject: Swaret, maybe? |
Swaret is a Slackware package tool that hunts down dependencies, if that is in line with your thoughts, but it needs a thousand dependencies which are not present in my install, and frankly this massively multipackage dependency (MMPD tm) thing is painful, critically painful in fact. Dependencies may yet break my will to continue. RANT: Developers: why not just embed the widgets from your 'dependencies' into your code? /RANT
APT4RPM and urpmi are probably useful tools, or so I hear anyway. I'm kinda leaning towards source tarballs, and as such I am asking in several places about compiling issues. FWIW, I was >< that close to going with LFS.... If I had possessed a functional hardware modem at the time I dropped considering LFS, I would most likely not be sporting a Slackware avatar.
nukes Linux Guru
Joined: 29 Aug 2024 Posts: 3935 Location: Somewhere just off the M62
Posted: Tue Sep 02, 2024 8:45 pm Post subject: |
That's why I use Debian or Gentoo. The net-fetch functionality is built in, and it works.
RPM and Slackware packages weren't designed with this in mind; Slackware tgz files don't even support dependencies.
Quote: | Developers: why not just embed the widgets from your 'dependencies' into your code? |
They can; it's called static linking. But it uses more memory and disk, and is normally considered bad practice, even though it can improve startup performance by not having to load dynamic libraries.
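The practical difference can be sketched in miniature (plain Python, with a made-up "library" function standing in for a real shared object): a dynamically linked program looks the symbol up at call time, so a library update reaches it automatically, while a statically linked program carries its own frozen copy.

```python
# A toy "system library" registry standing in for /usr/lib.
system_libs = {"greet": lambda: "hello v1"}

# Dynamically linked: resolves the symbol each time it is called.
def dynamic_program():
    return system_libs["greet"]()

# Statically linked: the symbol was copied in at "build" time.
static_copy = system_libs["greet"]
def static_program():
    return static_copy()

print(dynamic_program(), static_program())  # both see v1

# "Upgrade" the shared library in place.
system_libs["greet"] = lambda: "hello v2"

print(dynamic_program())  # picks up v2 automatically
print(static_program())   # still v1 - including any v1 bugs
```

That frozen copy is both the appeal (nothing on the system can break it) and the hazard (nothing on the system can fix it) of static linking.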
Don't get me wrong, I love Slack, I just don't use it anymore as I've found some distros suit me better. It was my first distro (besides Caldera OpenLinux, which I hated and quickly ditched - it came with a book).
_________________ Gentoo; 2.6.11 2.6.17.7 + patches
Debian sid 2.6.13
crouse Site Admin
Joined: 17 Apr 2024 Posts: 8985 Location: Iowa
Posted: Tue Oct 07, 2024 2:58 am Post subject: |
I hate to seem like a moron here..........but with today's huge hard drives, and space really NOT being at a premium in MOST cases anymore, I wonder if dynamic linking is still the best practice? Static linking definitely requires more space, but it also sidesteps the lack of a standardized OS layout..........
Now I realize that it would be bad practice if ALL programs did that, but wouldn't it be possible to create a program.......and package it that way.....so when ALL other attempts to install fail.......it could be used as a last resort, so to speak??
mmmna . . . . . . .
Joined: 21 Apr 2024 Posts: 5100 Location: Centah Bahnstead Nuh Ham-shuh
Posted: Tue Oct 07, 2024 3:45 pm Post subject: |
Dynamic links allow for finer-grained revision control within a system, something that developers in general seem bent on implementing. I do not need my system to change in increments as fine as 'nightly CVS' or 'nightly builds'. That focus is the 'Linux as a server' idiom creeping into other areas of code development.
Given the choice, I'd prefer to obtain only software packages that are offered in a 'known good' configuration - where the software includes dependencies which are known to work correctly. In those instances, nothing incompatible can easily be mixed in with the package I downloaded. Including dependencies, e.g. static linking, reduces disappointments and reduces user overhead.... IMO, that is one thing M$ gets right; why force me to get a degree just to troubleshoot one bad program?
Forcing dynamic linking lets the developer cut the bandwidth bill, a cost which they alone bear. Dynamic links also allow a minimal client-side filesystem: all dependencies point to the same library, because all libraries are supposedly backwards compatible... have the newest version on the client and all is fine.
At least, I think that is what it is all about.
Nukes - for Slackware, look into swaret, if you haven't already.
_________________ Slackware 10.2 with stock kernel
nukes Linux Guru
Joined: 29 Aug 2024 Posts: 3935 Location: Somewhere just off the M62
Posted: Wed Oct 08, 2024 7:28 pm Post subject: |
I've looked into it, but I'm not about to redo my whole setup now. I've got it running fast and clean on Gentoo, and I prefer its way of doing some stuff.
Dynamic and static linking are both workable, but you still need the software installed to compile from source. Static linking is considered bad practice because of updates: when you update a library, every program that links it dynamically gets the fix automatically, while a statically linked copy doesn't change and can break. (Look at the FaD client when a new version of curl came out.)
Quote: | That focus is the 'Linux as a server' idiom creeping into other areas of code development. |
It's not just Linux that uses dynamically linked libraries (sound familiar now?); every OS does it. It would be a bad idea for everything to be statically linked, and any book you find will tell you to load libraries from /usr/lib etc. rather than compiling them in completely. If the binary was built with, say, version x of glibc, it wouldn't necessarily work on your system with version y. Hence all this carry-on.
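That version-x-built, version-y-installed mismatch boils down to a check like this (a toy sketch; the symbol names and version numbers are invented, and the real dynamic loader's versioned-symbol logic is far more involved):

```python
# Toy model of the version check a loader effectively performs.
def parse_version(v):
    """Turn '2.3' into (2, 3) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def can_run(binary_needs, system_has):
    """A binary runs only if every versioned symbol it needs is provided."""
    return all(
        parse_version(system_has.get(sym, "0")) >= parse_version(needed)
        for sym, needed in binary_needs.items()
    )

needs = {"glibc": "2.3"}                  # what the binary was built against
print(can_run(needs, {"glibc": "2.5"}))   # newer system library: True
print(can_run(needs, {"glibc": "2.2"}))   # older system library: False
```

When the check fails, you get the familiar "version `GLIBC_x.y' not found" style of error instead of a running program.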
Quote: | I'd prefer to obtain only software packages that are offered in a 'known good' configuration - where the software includes dependencies which are known to work correctly. |
Look no further:
www.debian.org
_________________ Gentoo; 2.6.11 2.6.17.7 + patches
Debian sid 2.6.13
mmmna . . . . . . .
Joined: 21 Apr 2024 Posts: 5100 Location: Centah Bahnstead Nuh Ham-shuh
Posted: Thu Oct 09, 2024 12:22 pm Post subject: |
Dynamic versus Static.... therein lies a big problem.
People who need to troubleshoot a failed package install are frequently newbies. In fact, I believe more people are now coming to Linux as newbies than there are experienced admins to help them!
When a dynamically linked package install goes wrong, it can produce, for example, a hundred lines of errors.... the newbie cannot differentiate minor but recoverable soft errors from the fatal errors buried in the streaming output.
OTOH, if the package contained everything it needs, already known to work with the package in question....
Besides that, by profession I'm a software QC kind of guy lately.
Which scenario is more likely to succeed: the case where all the correct dependencies are included, or the case where the tertiary software is left to chance? After all, how much effort should an admin need to put forth to install, configure, and use a dynamically linked package that failed because a dependent package was fouled up (as can happen as a result of just one revision)?
I can argue in the Windows realm: ctl3d*.dll is a common DLL, but there are no fewer than 5 versions being passed around by software (FWIW, on this PC at work, W98SE, I have 4 different files that match the string 'ctl32*.dll'). When a later version overwrites an older one, that is supposedly acceptable. I can't begin to count the times I've installed a Windows program that wrote an older version over a newer one, and then some time later a known good program crashes.
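That overwrite failure mode can be sketched like so (a toy model; the file name and version numbers are illustrative, not the real ctl3d files):

```python
# Toy model of the shared-DLL overwrite problem.
installed = {"ctl3d.dll": 5}   # the system already has version 5

def careless_install(dlls, bundled_version):
    """A badly behaved installer: copies its bundled DLL unconditionally."""
    dlls["ctl3d.dll"] = bundled_version

def careful_install(dlls, bundled_version):
    """A well-behaved installer: only upgrades, never downgrades."""
    if bundled_version > dlls.get("ctl3d.dll", 0):
        dlls["ctl3d.dll"] = bundled_version

careless_install(installed, 3)
print(installed["ctl3d.dll"])  # 3 - downgraded; programs needing v5 now crash

installed["ctl3d.dll"] = 5     # restore
careful_install(installed, 3)
print(installed["ctl3d.dll"])  # 5 - preserved
```

The crash comes later, in an unrelated program, which is what makes this so hard to diagnose.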
I can argue in the DOS realm: compiled software in DOS predates the dynamic paradigm. It was rare for a program not to install correctly. The typical errors were a matter of not getting along with another program, and some were caused by poor programming assumptions. But what you got was usually binary performance: either it all worked or nothing worked.
The 'crashing program' is nearly a direct result of the dynamic linking paradigm!
Now, given the increasing complexity of software design (nothing gets less complex with age, only more complex), and looking at a package's dependencies, I see ever-increasing opportunities for errors. One dependency alone might cause 2 errors, but where today's packages rely on 5 dependencies, that means up to 5 times the repair work for an admin. Tomorrow, development will add a feature and a dependency, and possibly 2 more errors - that's 6 times the repair work. Next year, 8 dependencies, etc.
The probability that a given dynamic package will install correctly within its own time frame (I'm not talking about adding a package to 3-year-old dependencies) is a decreasing, not increasing, value; the probability of success approaches zero.
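Putting illustrative numbers on that argument: if each dependency installs cleanly with probability p, and (a simplifying assumption) the dependencies fail independently, then n of them all succeed together with probability p^n, which shrinks fast as n grows. The p = 0.9 figure below is invented for illustration:

```python
def install_success_probability(p_each, n_deps):
    """Chance that all n independent dependencies install cleanly."""
    return p_each ** n_deps

for n in (1, 5, 15):
    print(n, round(install_success_probability(0.9, n), 3))
# 1 0.9
# 5 0.59
# 15 0.206
```

So even a 90% per-dependency success rate leaves a 15-dependency package with roughly a 1-in-5 chance of a clean install, which matches the experience in this thread.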
I agree that dynamic can work; I also agree that fewer issues result from static.
And above all that, why not simply copy the code from the library and paste it into your package in the first place? After all, it is LEGAL!
Your turn!