• 0 Posts
  • 90 Comments
Joined 1 year ago
Cake day: June 20th, 2023


  • I was already a dev in a small IT consultancy by the end of the decade, and having ended up as “one of the guys you go to for web-based interfaces”, I did my bit pushing Linux as a solution, though I still had to use IIS on one or two projects (I even had to use Oracle Web Application Server once), mainly because clients trusted Microsoft (or really any large software vendor, such as Microsoft, IBM or Oracle) but did not yet trust Linux.

    That’s why I noticed the difference that Red Hat, with their Enterprise version and support plans, made to the acceptability of Linux.



  • CRT monitors internally use an electron gun which fires electrons at a phosphor screen (from the back, obviously; the whole assembly is one big vacuum chamber with the phosphor screen at the front and the electron gun at the back), using magnets to deflect the electron beam left/right and up/down.

    In practice the beam was pointed at the start of a line, from where it would start moving towards the other side; after a few clock ticks the line data would start being sent, then after as many clock ticks as there were points on the line it would stop for a few ticks, and then the beam would sweep to the start of the next line (with a wait period for that too).

    Back in those days, when configuring X you actually configured all of this in a text file, at a low level (literally the clock frequency, total lines, total points per line, the empty lines before the data starts - the top of the screen - and after it ends, as well as the blanking ticks at the start of each line before the data and after it), for each resolution you wanted to have (see the sketch below).

    All this let you define your own resolutions and even shift the whole image horizontally or vertically to your heart’s content (well, there were limitations on things like the min and max supported clock frequency of the monitor and such). All that freedom also meant that you could exceed the capabilities of the monitor and even break it.
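
    Purely as an illustration of how those numbers fit together (using the commonly quoted VESA timings for 1024x768 at 60 Hz rather than values from any particular config, and glossing over the exact XFree86 syntax), here is a small Python sketch:

```python
# Illustrative only: the classic VESA 1024x768@60 timings, roughly as they
# would appear in an old XFree86 Modeline:
#   Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806
pixel_clock_hz = 65.0e6                  # dot clock: how fast "points" are sent
h_visible, h_total = 1024, 1344          # visible points per line vs. total (incl. blanking)
v_visible, v_total = 768, 806            # visible lines vs. total (incl. blank lines)

line_rate_hz = pixel_clock_hz / h_total  # lines swept per second
refresh_hz = line_rate_hz / v_total      # full frames drawn per second

print(f"visible resolution:   {h_visible}x{v_visible}")
print(f"horizontal scan rate: {line_rate_hz / 1000:.1f} kHz")  # ~48.4 kHz
print(f"vertical refresh:     {refresh_hz:.1f} Hz")            # ~60 Hz
```

    Get the totals (or the clock) wrong and the resulting line and frame rates can land outside what the monitor supports, which is exactly how you could damage it.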


  • In the early 90s all the “cool kids” (for a techie definition of “cool”, i.e. hackers) at my University (a Technical one in Portugal with all the best STEM degrees in the country) used Linux - it was actually a common thing for people to install it on the PCs of our shared computer room.

    Later in that decade it was already normal for it to be used in professional environments for anything serving web pages (static or dynamic) along with Apache: Windows + IIS already had a smaller share of that market than Linux + Apache.

    If I remember correctly, in the late 90s Red Hat started providing their Enterprise version with things like support contracts - so beloved by the corporates who wanted guarantees that if their systems broke the supplier would fix them - which did a lot to boost Linux use on the backend in non-Tech but IT-heavy industries.

    I would say this was the start of the trend that would ultimately result in Linux dominating on the server-side.



  • Around here in Portugal, where every summer the temperature exceeds 40°C on at least some days in August, we have external roll-up shades on every window, so one of the tricks is to keep the shades down and the windows closed during the hottest and sunniest parts of the day, at the very least the afternoon.

    Then at night you open the windows and let the cooler night air in (even better if you do it in the early morning, around sunrise, which is the coolest time of the day).

    Note that this doesn’t work as well with curtains or internal shades, because with those the conversion of light into heat when the light hits the shades/curtains (they’re not mirrors and don’t reflect all of the light back out) happens inside the house, so that heat gets trapped indoors.




  • Almost 30 years into my career as a software engineer, I’m now making a computer game that takes place in Space and where planets and comets follow Orbital Mechanics, so I’m using stuff I learned at Uni all those years ago in degree-level Physics (there’s a toy example of the kind of thing I mean at the end of this comment), since I went to university to study Physics (though I later changed to an EE degree and ended up going to work as a software developer after graduating, because that’s what I really liked to do).

    I’ve also had the opportunity to use stuff I learned in the EE degree for software engineering, the most interesting of which was using my knowledge of microprocessor design during the time I was designing high performance distributed systems for Investment Banks.

    (I’ve also used that EE knowledge in making Embedded Systems - because I can do both the hardware and the software sides - though that was just for fun)

    Also, pretty much throughout my career I would often end up using University-level Mathematics; for example, in banking it tended to be stuff like statistics, derivatives and integrals (including numerical approximation methods), whilst game-making is heavy on trigonometry, vectors and matrices.

    So even though I never formally learned Software Engineering at University, the stuff from the actual STEM degrees I attended (the one where I started - Physics - and the one I ended up graduating in - Electronics Engineering) was actually useful for it, sometimes in surprising ways.

    At the very least, just the Maths will be the difference between being pretty mediocre and actually knowing what you’re doing in more advanced domains that are heavy users of Technology: I would’ve been pretty lost making software systems for the business of Equity Derivatives Trading if I didn’t know statistics, derivatives, integrals and numerical approximation methods, and ditto when making GPU shaders for 3D games if I didn’t know trigonometry, vectors and matrices.

    And this is without going into just understanding stuff I hear about but am not currently using, such as Neural Networks, which are used in things like ChatGPT; statistics is also invaluable for punching through most of the “common sense” bullshit spouted by politicians and other people paid to deceive the general public.

    Absolutely, you can be a coder, even a good one, without degree-level education, but for the more advanced stuff you’ll need at least the degree-level Maths, even if a lot of the rest of your degree will likely be far less useful or even useless.
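
    To make the orbital mechanics bit a little more concrete, here’s a toy sketch (nothing like the actual game code, arbitrary units throughout) of one small body orbiting a central mass under Newtonian gravity, stepped with semi-implicit Euler - exactly the kind of degree-level Physics plus numerical method I mean:

```python
import math

# Toy example (not actual game code): a small body orbiting a central mass
# under Newtonian gravity in 2D, using arbitrary units.
GM = 1.0                 # gravitational parameter of the central body
x, y = 1.0, 0.0          # starting position
vx, vy = 0.0, 1.0        # starting velocity chosen for a roughly circular orbit
dt = 0.001               # time step

for _ in range(10_000):  # simulate 10 time units (a bit over 1.5 orbits)
    r = math.hypot(x, y)
    ax, ay = -GM * x / r**3, -GM * y / r**3  # Newtonian gravitational acceleration
    vx, vy = vx + ax * dt, vy + ay * dt      # semi-implicit Euler: update velocity first...
    x, y = x + vx * dt, y + vy * dt          # ...then position, which keeps the orbit stable

# For a circular orbit of radius 1 the distance should stay close to 1.0
print(f"distance from central body after 10 time units: {math.hypot(x, y):.3f}")
```

    (A real game would obviously use proper units and probably a fancier integrator, but the underlying maths is the same.)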



  • Clearly it’s not Infosec!!! ;)

    How do you provide value when that value isn’t needed at the moment?

    Well, that’s why a lot of people want to change things at a political level - the great “pure competition, no safety net” neoliberal take on Society results in most people, whose job is basically a commodity and who don’t have a “unique value proposition”, being pretty close to slaves in this system: as human beings they naturally need food, water and shelter continuously, yet they’re in an environment where access to those is controlled by having unusual amounts of the very thing that people selling commoditized services cannot get enough of via their work - money - so they’re squeezed into a position where they de facto don’t have any choices, nor do they even have the necessary space to invest in themselves to change into some other job where they might have a “unique value proposition”.

    This situation could be changed if people were guaranteed access to the basic essentials, for example via a Universal Basic Income, since even people doing commoditized work would then have the choice to refuse to “sell” their work if they found the “price” too low or the conditions too bad, which would push the market to improve the job offers for them (and they are by far the majority of people); plus a lot of them could even choose to improve themselves or their skills, become inventors or artists, or work in areas with high social value but a low “price” because they felt rewarded by it in ways other than money.

    In summary, I think there is no solution within the current paradigm, since it makes this problem systemic; any viable solution requires changing at least some things in the paradigm, most notably the part where the basic essentials required by human beings are used, at the systemic level, to force most people into a no-true-choice neo-slavery.

    The changes we’ve seen to the paradigm in the last decade or two are exactly in the opposite direction: ever more expensive housing and even the destruction of the social safety net are forcing even more people to accept bare-minimum, near-slavery work just to survive.


  • “Most companies” is not necessarily the same as “most jobs”, since some companies (i.e. large ones) offer many more jobs than others. What counts for how often jobseekers see this kind of practice is “most jobs”, so you can have just some companies doing this, but if they’re the largest ones, that means “most jobs” have this kind of thing. It was probably a needless distinction for me to make in that post.

    I don’t dispute the point that people who are in or seeking employment should not reward bad practices like that; I’m explaining what the previous poster meant: that in the present-day economic conditions, most job seekers, whilst not wanting to reward bad practices, de facto don’t have the choice to avoid doing so, because they’re under huge pressure to get a job, any job, as soon as possible.

    Also, your theory of hustling your way up into valued roles is hilarious in light of my almost 3 decades out in the job market - since pretty much the 90s, the main way to progress up the career ladder requires that people change jobs. At least in expert areas, the average salaries of people who stick to one employer are much lower than the average salaries of people who switch jobs periodically, because people negotiating a new job whilst still working in the old one will only ever accept a better job - so their conditions will improve - whilst people in a job and not looking are seldom offered better conditions unless they at least start pretending that they’re looking for a better job. I mean, it’s possible to progress without moving jobs, especially early in one’s career and under good management; it’s just generally slower and harder than hopping jobs.

    I don’t even disagree that being choosy about which jobs you take is how people should behave if they can: I’ve actually successfully done that for all but one of my job transitions, but that’s because I’m a (modesty aside) well above average senior expert in a high-demand area, hence I usually get a lot of offers if I put my CV out, and since I’m well paid I have a large pile of savings to rely on during periods between jobs, and thus I can be choosy (and the only time I had to “take a shit job” was exactly early in my career, after the Year-2000 Crash, when after 6 months out of a job and running out of savings I had no other option; 11 months later, after searching for a new job from Day 1 there, I finally found a better job and moved).

    Most people in this World aren’t in such a position, and casually suggesting that other people act as you suggest shows a huge level of ignorance of the economic conditions of most people out there nowadays.

    The kind of wording you use on this suggests you’re in a position of reasonable prosperity and power in the marketplace as a job seeker in your area, which, while good for you, is not representative of the median experience of job seekers out there - just like my own situation is not.

    Giving those kinds of “I’m alright, Jack”, “everybody should do it just like I can now that I am where I am” suggestions to other people, whilst ignoring that most are “not alright” and not in the same position as you, is at best insensitive and ignorant, at worst insulting - which is probably why you’re getting downvotes.


  • I think that the point is that if those practices cover most employment places in a domain (i.e. the bad practices need not be done by “most companies”, just the largest ones) and people’s main concern is having to eat, they generally don’t have the time to look for the jobs where this shit doesn’t happen, and even if they do, they’d be competing for a small number of jobs against everybody else also looking for those jobs.

    Or, putting it another way, your idea that somebody can simply “only go for the good jobs” fails at two levels:

    • At an individual level, in the absence of clear upfront information about which jobs are good, people who are between jobs and getting squeezed by high money outflows with no inflows (i.e. the ones genuinely concerned about their “need to eat”) can only search for and be demanding about the quality of the jobs they apply for until they get to the point where they just have to take whatever job they can before they run out of savings.
    • At a systemic level, if everybody is going for the few good jobs, there won’t be enough of those jobs for everybody. Now, in an ideal world where people could hold out for a good job as long as it took - i.e. if people weren’t pressed by the need to eat - the bad jobs would disappear (because they would never find any takers) and all jobs would become good jobs. Once again, “people need to eat” means that idea of yours won’t work at a systemic level.

    Your idea to “exclude from consideration companies that do this” only works for some people, not all people, and only for those who have enough savings or low enough money outflows to not have to concede defeat and take one of those not-so-great jobs because they’re running out of money.

    So the previous poster’s comment of “we need to eat” neatly encapsulates, in a simple sentence, the reason why your idea won’t work as a general practice, or even as an individual practice for most people, in the present-day society and economy of most countries.


  • Somebody who is not a software developer, or is a junior one who only ever worked on one or two major projects and got lucky (it really depends on the country and the industry), might believe it.

    It’s hardly unusual for people who only ever worked in one place to think everything is like that, and some of those do get lucky (not all software development environments out there are like the US Tech Industry) and end up right after Uni in a place with some good senior techies who make sure environments are properly set up.

    Also, in-house development in industries where software is mission critical and new versions breaking Production might result in massive losses or death (for example, Finance) always has proper Testing and Staging environments - you don’t really want to lose millions of dollars (possibly hundreds of millions if unlucky) by having all the traders on a Trading Floor twiddling their thumbs because somebody didn’t do proper integration testing in Staging of some comms protocol changes made to two different systems before pushing them to Production.
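
    As a purely hypothetical sketch of the kind of gate those environments enable (the script name and environment label below are made up, not from any real system): a release only gets promoted to Production once the integration test suite has passed against Staging.

```python
import subprocess
import sys

# Hypothetical promotion gate: run the integration test suite against Staging
# and refuse to promote the release to Production if anything fails.
# "./run_integration_tests.sh" is a made-up stand-in for whatever test runner
# a real system would use.
def integration_tests_pass(environment: str) -> bool:
    result = subprocess.run(["./run_integration_tests.sh", environment])
    return result.returncode == 0

if __name__ == "__main__":
    if not integration_tests_pass("staging"):
        sys.exit("Integration tests failed in Staging - not promoting to Production.")
    print("Staging is green - this release can be promoted to Production.")
```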




  • If their priorities were tracking customers, incentivising game integration with their store (i.e. gamemaker lock-in) and keeping the possibility of taking games away from customers, all like Steam does, they would not maintain that glaring backdoor to all of those priorities: letting customers download full installers that they can keep and which do not check back with the store on install.

    I’m sure that they would like the advantage of tying people (both gamers and gamemakers) to their store, yet clearly they’re not forcing that as Steam does, so what they’re prioritizing (in other words, their priority) is clearly not that.

    Given that their unique selling proposition is “no DRM” or more broadly “customer freedom to use the games they bought”, it makes sense that that is GOG’s overriding priority, even if they would also like all the (for a store) nice side-effects of built-in DRM and phone-home installers like Steam’s.



  • The funny bit is that the example from the video requires a companion device in order to actually have something to send, since it doesn’t have a keyboard or microphone, and the natural companion device for that in a mobile situation would be … an Android or iPhone.

    That said, it would also work paired with a tablet, and maybe it can just be paired with a Bluetooth keyboard (the hardware in it - specifically the ESP32 - has Bluetooth support built in, so it depends on the software).

    Mind you, this is only a limitation of this specific implementation (which is basically a gadget for electronics hobbyists, hence no built-in keyboard), not of the LoRa stuff itself.