• 5 Posts
  • 29 Comments
Joined 1 year ago
Cake day: June 5th, 2023






  • Is there a limit to one-time cards

    There should be something about that in the Revolut EULA or similar, but I’ve never encountered it. The moment a payment goes through, a new card appears in the app

    You asked me to elaborate, but how private your data really is might be hard to answer

    It’s a business, and a closed-source one. They are of course bound by laws and regulations, but there’s practically no way to make sure they aren’t selling transaction data/statistics under the table. Also, the cards issued by them are either Visa or Mastercard (I don’t recall which), so those companies have that info too. And I’d bet they sell transaction analytics
    Then there’s also the matter of telemetry. Apart from the telemetry gathered by the app for Revolut itself, I guess there’s no way to use it without GApps

    FWIW I did not notice an influx of spam after registering an account. But that doesn’t prove anything, of course

    We can’t inspect the app’s code, so it’s probably only about as private as other banking apps





  • access my documents on my different computers or my Android phone

    I had a similar setup, but I was using Obsidian and pCloud. Syncing up and down was done by scripts using rclone/Round Sync (on Android). The scripting part might be harder to achieve on Windows

    But I came here to say that I finally decided to test Syncthing, and it’s so much easier! It just works. Now pCloud is more of a backup and sharing layer than a gateway
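    For anyone curious what such a sync script boils down to, here is a minimal, self-contained Python sketch of a one-way “up” pass between two local folders. All names here are made up for illustration; in a real setup rclone/Round Sync would do the actual transfer against pCloud (e.g. `rclone sync ~/vault pcloud:vault`):

```python
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> list[str]:
    """One-way sync: copy files that are missing in dst or newer in src."""
    copied = []
    for f in sorted(src.rglob("*")):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        # Copy only when the destination is absent or stale
        if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 keeps mtimes, so reruns are no-ops
            copied.append(str(f.relative_to(src)))
    return copied
```

    The “down” direction is the same call with the arguments swapped. Syncthing replaces both passes (and the conflict headaches) with continuous two-way watching.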



  • Yes, I can. But you need much more to accomplish this

    1. You need reach: are there any mods/admins who would feel OK vouching for your abilities? And preferably, could you have info about your proposal stickied on a bunch of communities where it could reach people open to chipping in?
    2. You need to convince those you reach that you’re not a Nigerian Prince. A mod/admin saying you’re legit could help with that, but maybe there’s something more you could do to convince the public?
      Maybe I simply don’t know who you are; maybe in reality you are second in command after Dessalines. But either you are a random dev saying “I can do that”, in which case you need to somehow convince others that you really can, or you are just not recognised for your work, in which case you need to point us to what ties you to it. I saw the fedi project on your GitHub, so you probably can code (I’m not going to audit your project in order to assess your skills, sorry). But are you just a dreamer, or are you serious?
      I’m sorry if what I’m saying sounds harsh. I just feel that how you come across to the other side gets lost in translation here
    3. GitHub is not the most popular support medium. Why not also have Patreon/Ko-fi/OpenCollective/etc.? Many will chip in more easily if they’re already present on the platform






  • To use an analogy: somewhere along the road we went from “your lips will look nice with this lipstick” to “I can’t leave the house without lipstick on”. And raising awareness about issues is basically marketing too; it has to be in order to make it through to you among all the other noise. It’s just that this time it’s not you who isn’t thin/tall/curvy/vibrant/… enough, it’s the world that isn’t safe/happy/wealthy/just enough. I’m not saying it’s untrue, but it has a side effect: when you are overstimulated with ads and news, it does feel like the world is going to end.

    Think about what would really have to happen to purge us all, 100%. Even climate change, I don’t think, could do it. 90% maybe, and sure, that would change our world, but it still wouldn’t end it. I can’t find the paper now, but we only need something like 1000 random people to survive in order to have enough genetic diversity for the species to continue. Something would have to wipe us all out at once to really kill us off. Otherwise, we’ll just adapt and carry on.

    And if not? Welp, that’s a chance for the UFOs to thrive. It’s not like humanity is required for the wheels to keep turning



  • From what I learned at university:
    The CISC instruction set (x86) was developed to address the technical reality of its time: costly CPU operations and comparatively fast reads from storage. Not long after, the situation changed and storage reads became slower relative to computing time (putting it simply, it’s faster to read an archive and unpack it than to read the unpacked data). But in the meantime the PC boom happened. In a way, backward compatibility and market inertia locked us into an instruction set that is not the best optimized for our tech, despite the fact that RISC (for example ARM) was conceived earlier.
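    The archive analogy is easy to sanity-check. Here is a rough sketch (not a benchmark, and with a made-up payload) showing the core trade: compressed data means fewer bytes to pull from slow storage, paid for with extra CPU work to unpack:

```python
import zlib

# A repetitive "code-like" payload; real instruction streams are also fairly
# compressible, which is the intuition behind dense CISC-style encodings.
raw = b"mov eax, [rbp-8]\nadd eax, 1\nmov [rbp-8], eax\n" * 1000

packed = zlib.compress(raw, level=6)

# Far fewer bytes to read from storage...
ratio = len(packed) / len(raw)

# ...at the price of a decompression step on the CPU.
assert zlib.decompress(packed) == raw

print(f"{len(raw)} bytes raw -> {len(packed)} bytes packed (ratio {ratio:.3f})")
```

    Whether unpack-then-use actually wins depends on the storage-to-CPU speed ratio of a given machine, and that ratio shifting over time is exactly the point above.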

    In a way, software (compilers and interpreters too) is like a muscle: the more/wider it’s used, the better it becomes. You can be writing in Python, but if your interpreter has some missed optimization opportunities, your code will run faster on an architecture with a better-optimized interpreter available.

    From personal observations:
    The biggest cost of software is not writing something super efficient. It’s maintainability (readability and debugging), ease of use (onboarding/training time) and versatility (“let’s add the missing feature to what we have, instead of reinventing the wheel and maintaining two toolsets”).

    New languages are not created because they can do something faster than assembler (they can’t, btw). If assembly code is written as optimally as possible, high-level languages can at best be as fast. Writing such assembly is a problem behind the keyboard, not a technical limitation. The only thing high-level languages do better is how much time it takes a human to work with them.
    I would not be surprised to learn that the bigger part of the big bucks you mention goes not into optimization but rather into “how can we work around that difference so the high-level interface stays the same as on the more widely used x86?”

    In the end it all boils down to machine code; it’s the only thing that really exists when it comes to executing code. If your “human-to-bits translator” produces unoptimized binaries, it doesn’t matter how high-level the language you wrote in was.
    And somewhere along the way we’ve arrived at a point where not even a few behemoths like Google or Microsoft throwing money into research (not that I believe they are doing so when it comes to optimization) is enough.
    It’s field use that from time to time provides a use case exposing an edge case where an optimization can be made.
    To find one purposefully? Dumping your datacenter in liquid nitrogen might be cheaper and probably more predictable.

    So yeah, I mostly agree with him.
    Maybe the times have changed a little: the things that gave RISC the biggest kick were smartphones, then single-board computers, so not that long ago. The improvements are always bigger at the beginning.
    But the fact that some companies are trying to get RISC back into userland means, in my opinion, that the computer world has only started to heal itself after the effects of the PC boom. There’s around a 20-year stretch where x86 was the main thing and RISC was a niche