Discussion in 'Linux' started by smallhagrid, Jun 17, 2019.
When connecting OSes that are almost 20 years apart, it's not surprising that one has to use a very old protocol version; this especially applies when connecting (old) Windows to (new) Linux.
Shares from M$ Windows to Linux and vice versa were/are more of an issue than shares between 2 Linux OSes, especially when both are 'new'.
I don't think many people today are out there trying to connect win2k with a recent Ubuntu LTS.
Whenever I could, I have handled data exchange/backup via (S)FTP.
I do that between a media server (Linux), a mobile (Android) and Windows to keep my media files folder up to date.
I simply do it manually whenever a folder needs to be synchronized (FileZilla / a folder-sync app).
I know a real share is different, since you use the share directly over the network and there are no local copies.
Hello Yen & All.
After working in this field for more than the aforementioned 20 years, I find the forced obsolescence of things even more upsetting now than I did all those years ago.
My main reason behind this is simply that probably over 90% of PC users ONLY do all the same basic stuff;
Email, some browsing, writing a letter or such every now & then - and maybe some simple games.
Very few folks by comparison are actually coders, hard-core gamers, video producers, and so forth - the people who actually have some need for all the hottest/newest stuff, OSes & heavy duty security.
For example - what I posted about here is for a tiny office with 3 people in it - all of whom are college educated, with one very close to getting her degree - and none of these folks have a clue about how any of their needed tech gear works !!
They depend upon some specific s/w to operate their office, but the rest is all emails, documents and some browsing like pretty much everybody else.
Another good friend of mine is highly skilled in another field entirely - but also intimidated by his tech gear. After I updated his OS, it totally failed to recognize his HP laser printer, which drove me nuts because the parallel port it was attached to was no longer recognized.
I had to leave it that way after finding no solution.
It turns out that I later found some mention of parallel ports having been deprecated some time back in the Ubuntu family - so I called him, he found a USB printer cable, removed the perfectly good parallel cable, re-connected the printer via USB - and boom - instant printer.
Later, I griped about this at the OS's forum and the deprecation was denied - but no fix was proposed either.
There is NOTHING wrong with win2k - I mean that literally;
It is rock stable & the PC with all those files on it is blocked from the internet - so any so-called security concerns are all on me, and there's never yet been a problem thereby, so:
If I need to connect it via a deprecated version of SMB, that should be up to ME - not the committee of OS makers who have some bizarre NEED to protect me from myself !!!
At the very least, instead of the baffling blockages that I saw - it could have a pop-up message that warns me about it, and then allows me to choose whether or not to do it anyways...
My work - my configuration - should be MY CHOICE !!
Having to struggle & dig & post for days just to find the correct explanation behind the problem itself is too extreme IMO.
At the other end of this comparison - if I had a bunch of '286, '386 & '486-level PCs using only the ancient coaxial type of ethernet cabling, & maybe only DOS, trying to connect somehow for file sharing - that scenario would be one in which I would expect barriers & struggles.
But NOT in the scenario described here.
The XP guest VM grabbed its shares from the win2k 'server' INSTANTLY - whereas the highly 'advanced' Linux host OS could not do so directly - nor even properly inform the operator what the problem was ?!?
Meh. Totally backwards.
I'd expect baloney like that from win-doze & NOT Linux.
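For what it's worth, modern Linux can still speak SMB1 to a win2k box; recent kernels and Samba just refuse it by default, and silently at that. Assuming an invented share name and mount point (`//WIN2K/files` and `/mnt/win2k` are made up for illustration), forcing the old dialect looks roughly like this:

```shell
# Force the SMB1/CIFS dialect that win2k speaks; modern kernels
# negotiate SMB2+ by default and the old server can't answer that.
sudo mount -t cifs //WIN2K/files /mnt/win2k -o vers=1.0,username=yourname
```

For the GUI file manager / smbclient side, the equivalent knob is `client min protocol = NT1` in the `[global]` section of /etc/samba/smb.conf (Samba 4.11 and later refuse SMB1 as a client by default). No warning dialog either way - it just fails until you know to set this.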
It is your choice. Anyway, you have to live with the issues that come with interconnecting OSes which are 20 years apart.
Old-to-old is no problem, if the old machines are still available and running.
I have been around 'PCs' since 1982 and have known lots of different OSes.
Then again, carrying all the legacy stuff along in a modern OS of today would have its own disadvantages.
I have had to learn that (S)FTP is a more backwards-compatible protocol than SMB (especially the CIFS flavor/derivative from MSFT).
In this situation there WILL be forward movement - but at the moment it is necessary to maintain the older OSes for the 2 important, expensive applications that are shared over the network.
The most expensive app will be replaced with an online service as soon as I can push that ahead, and soon after that the old file server OS will also become Linux based, so the sharing nonsense will come to a permanent end.
Between the time I began the multi-OS adventure there and the present, I have found many great solutions AND learned a lot in the process as well.
Given that my friend's tiny office is actually quite busy (he has a very good reputation in his field...) and that it is ongoing with the work sometimes running into weekends, I have ensured the needed continuity until such time as changes can be slid carefully into place...just aiming to keep things going smoothly for him.
And speaking of really old OSes being used in mission critical places:
How about all the hospitals that still rely upon OS/2, and:
What about all the industries which still rely upon COBOL & FORTRAN ??
Compared to those systems, using XP or win2k is incredibly modern !!!
Last winter I ditched a scientific OS/2 FTIR spectrometer workstation. It had an MO-disk drive but was not connected to a network. It was still working.
2 years ago we still had a UNIX workstation with an NMR spectrometer. When we migrated our clients from XP to W7, though, the Samba shares did not work anymore.
When you have a small network that you can administrate yourself, it's easier to include old clients, since you can run an old server for them.
Also, old scientific clients running as standalone workstations are no problem.
The more different generations of operating systems you have to interconnect, the harder it gets.
This applies to bigger enterprise networks which have their own administrative section with QA and QM.
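The 'old server' for old clients doesn't even have to run an old OS: a current Samba can be told to accept the legacy dialects again - with all the security caveats smallhagrid mentioned, so only on an isolated network. A sketch of the relevant smb.conf lines (the share name and path are invented examples):

```ini
# /etc/samba/smb.conf -- accept win2k/XP-era clients (isolated network only!)
[global]
    # Re-allow SMB1, which is disabled by default since Samba 4.11:
    server min protocol = NT1
    # Very old clients can only authenticate with NTLMv1:
    ntlm auth = yes

[instruments]
    path = /srv/instruments
    read only = no
```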
In scientific use you frequently have to deal with this situation:
The scientific device was hellishly expensive. It had been validated together with the OS and the PC's hardware.
The OS and PC get old, but the device itself still delivers usable data.
You cannot upgrade the OS, since the device driver is proprietary and not available for newer OSes / PCs.
The makers of scientific devices want to sell new ones. But the new ones are not necessarily 'better', or a replacement is far too expensive compared to the little the new one would actually do better.
So we also keep a few old clients running on old OSes.