
All Activity


  1. Past hour
  2. Yup, I look forward to seeing what they will do with MSFS 2024 👍.
  3. Both the Hawker Hurricane and the Singer Sewing Machine have Check Bobbins warning decals. 🙂
  4. Thanks for the information. I do agree with you, though I still think the costs will come down a lot and efficiency will improve over time, much as they did for chess programs: in 2024 you can afford a chess program that is even better than Deep Blue was in 1997, and Deep Blue cost a fortune to run back then.
  5. You can turn the AI traffic to basic only, so only ATC will "cost" you. But since ATC also talks to the other aircraft, this might indeed get a bit out of hand. However, we don't know how pricing will change once SI has AI traffic. They also have to pay for the voices for AI traffic and ATC, and most likely the $30 is not enough for that. They will have to pay OpenAI the same amount for voices as BATC, and since they demand even more from OpenAI (the actual AI logic, i.e. the text, which has to be generated for each and every AI aircraft!), their costs have to be much, much higher than BATC's. I therefore expect them either not to introduce AI traffic at all or to increase their pricing massively. Since the latter isn't really an option, because people already think it's too much for one (!) aircraft, I assume they will not introduce AI traffic, or at least have ATC not talk to it. I can't imagine how this would work out in any other way. (A rough sketch of how these costs scale with traffic is at the end of this activity list.)
  6. Apparently it just got a graphics and sound upgrade in the new update!
  7. Looks like another delay! Maybe by May 21st! Delay, delay, delay! Sounding like a familiar theme! 😲
  8. It’s the aircraft. On their Discord people say autotune doesn’t work with the CRJ for some reason. Maybe Aerosoft need to update something (wouldn’t hold my breath) or BATC might find a workaround…
  9. So actually, I did some reading on what you said, and yes, it does seem like the compute power required to run ChatGPT, especially with all the users using it for free, makes it a loss maker for OpenAI at the moment (the estimate is some 200 million ChatGPT users, but I'm not sure what percentage are free users and what percentage are paid). But I think there will always be a free version of ChatGPT/Co-Pilot/Bard, for two reasons:
     1. Throttling, plus a compute cap on each query for free accounts, which lowers the quality of the answer returned (the compute cap can sometimes lead to the famous "hallucination" of answers).
     2. Improvements in technology over time will lower the cost per query.
     With respect to 2., I'll simply cite the comparison of Deep Blue with modern chess programs that run on a desktop or laptop: Deep Blue in 1997 required a room full of supercomputers to beat Garry Kasparov, so a chess program as advanced as Deep Blue would have cost an average person hundreds of millions or billions of dollars back then (or whatever they spent on Deep Blue in 1997). But with improvements to hardware and to the software/algorithms over the years, everything gets cheaper, and now you and I can afford a chess program in 2024 that can beat Deep Blue, for a tiny fraction of the price. I also see that "AI chips" are coming to replace the GPUs behind the compute power for AI; Microsoft themselves have come up with an AI chip that will start to replace some of the NVIDIA GPUs they are using for AI computation. When the industry comes up with tailor-made chips for AI (just like the tailor-made ASIC chips for mining Bitcoin, which phased out GPUs), the cost will come down further.
     With respect to 1., OpenAI, Microsoft, Google, etc. can throttle the number of questions and cut down on the compute power spent before returning an answer to a particular question. Cutting the compute per question is akin to a chess program that returns an answer earlier, which saves compute time: instead of evaluating, say, 1 million chess positions it only evaluates 10,000, which is a hundred times cheaper (I'm making up the numbers; I don't know how many positions a modern chess program evaluates). A toy sketch of this throttle-plus-cap idea is at the end of this activity list.
     So I think there will always be a free version of Bard, ChatGPT, etc. Of course there will be paid versions as well, for companies that require more compute power and more accurate answers, and OpenAI is also making money by charging companies to use their API. But for everyday users I think there will always be a free version, as long as companies like Microsoft, Google, Amazon, and Apple are competing against each other (technically, Co-Pilot isn't free because you have to own a version of Windows, and if Apple comes out with AI too, you would have to own an Apple product to use it).
  10. Very strange: this line does nothing other than write debugging info to the logfile. But if it's working now, then all is okay 🙂 Oxynator was originally designed only for the A2A C182 and was also tested with the A2A Comanche. Oxynator will never work with non-A2A aircraft, because the gauge uses A2A-internal functions/variables that other aircraft don't have. I also cannot guarantee that the gauge will work with every A2A aircraft.
  11. Because they're treating it as an investment and a research tool: the more you use it, the more data they have to tune it. LLMs are presently one of the most expensive compute workloads that exist. Even before the recent explosion, OpenAI was spending somewhere between $1-2M per day in compute costs, and the estimates just for training GPT-4o are in the hundreds of millions. These models are massive and need a ton of really expensive hardware to make them go. A super-basic GPT-3 setup, useful for a small handful of users, is quoted around this specification:
     - 4x compute nodes (probably with something like A800s): $40-50K each
     - 8x InfiniBand (200 Gbps) per node: $16K each for cards, $20K for the switch
     - Storage server: $35K, plus $20K for connectivity infrastructure
     So, just for a small setup sized for a basic research lab, you're already in for $300+K of static hardware costs before enclosures, racking, cooling, and electricity (a quick sum is sketched at the end of this activity list). Will the costs come down? Incrementally, sure, but there are no magic wands to wave here nor any freebies to be had. A single VM of this configuration in Azure (before storage and bandwidth costs) is $20K/mo.
  12. New SpaceX EVA suits tested. First-ever commercial spacewalk. 1,400-kilometre orbit. Starlink uplink and downlink from space. Laser communication. Will be in the outer regions of the Van Allen belts. Radiation dose will be equivalent to 2-3 months on the ISS. 40 research experiments. The suit has a built-in heads-up display.
  13. Thanks, good to know. Pre-oiling, warmup and mixture/temp awareness on the ground and in all phases of flight pays off, not surprisingly. Of course now that I've said that, something will probably blow up. 😄
  14. Just seen the announcement of the update. Tried to find videos of glider towing and/or winch launches but couldn't find any 😕 Any links welcome!!!
  15. I think it is economic. Having children and raising them is too darn expensive; it takes a much higher percentage of the family budget than it used to. For a few thousand years, 90% of the population were farmers, and they were only able to produce enough to feed themselves and the 10% in the cities. Children were the cheapest labor. There wasn't any labor-saving farm machinery beyond horses, plows, etc., and they had sheep and chickens. I know a family (they live off the grid) who had a few children, then waited about 15 years and had a few more. The idea was that the kids would be farm hands and maids for the parents, who lived in a very big and tall ranch house. They had no electricity. They did have a tractor, a car, and a large fancy wood kitchen stove. They never hired anyone; the kids did the work. 80 acres. In modern times, kids are a financial expense, not an asset.
  16. I have a problem with this website on my phone. It works well on my PC, but mostly not on my phone. Best regards, Grzegorz
  17. Yeah, it was quite a journey, LOL. Revolut did the trick in the end!
  18. There isn't a regulatory set of verbiage for pilots. The AIM has a lot of recommendations, though those are all techniques, not regulations. Many are best practices and are what pilots actually use, but the AIM isn't the CFR. The OP could look here for some of those highly recommended techniques for USA flying: https://www.faa.gov/air_traffic/publications/atpubs/aim_html/chap4_section_2.html
  19. So far, I've got this: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1644264/pdf/procrsmed00338-0007.pdf https://eprints.lse.ac.uk/22514/1/2308Ramadams.pdf https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1501789/pdf/califmed00143-0080.pdf I'm sure there is more.
  20. I've found the same thing. If I view the knobs head-on, I can only adjust the outer knob. That said, adjusting the viewing angle (I use ChasePlane) does allow me to manipulate the inner knob. It works for me if I look up at the knobs from kind of an upper sideways view (if that's understandable).
  21. The fresh install took care of it. Sorry to bother ya. Thanks for a lovely product as always!
  22. Today
  23. From my testing, I don't see how having traffic active won't become crazy expensive really fast. Those characters are flying by even when I'm solo and switching to basic in cruise. It could be that having premium for the whole flight with traffic will cost you more than a SayIntentions sub. I'm preparing for that by mostly using basic.
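Regarding the AI-traffic cost argument in item 5 (and the traffic-cost worry in item 23), below is a purely illustrative Python sketch of why per-flight cost scales with the number of AI aircraft ATC has to talk to. Every number in it is an invented placeholder, not SayIntentions or BeyondATC pricing.

```python
# Illustrative only: per-flight generation cost when every AI aircraft also gets
# ATC text plus a synthesized voice. All figures below are invented placeholders.

COST_PER_TRANSMISSION = 0.02   # hypothetical $ per generated ATC transmission
CALLS_TO_USER = 30             # hypothetical transmissions addressed to you
CALLS_PER_AI_AIRCRAFT = 10     # hypothetical transmissions to each AI aircraft

def flight_cost(ai_aircraft: int) -> float:
    """Total generation cost for one flight with the given number of AI aircraft."""
    return COST_PER_TRANSMISSION * (CALLS_TO_USER + CALLS_PER_AI_AIRCRAFT * ai_aircraft)

print(f"solo:        ${flight_cost(0):.2f}")    # $0.60
print(f"20 AI planes: ${flight_cost(20):.2f}")  # $4.60, growing linearly with traffic
```

The only point of the sketch is the linear term: whatever the real per-transmission price turns out to be, having ATC address every AI aircraft multiplies it.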
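On the free-tier argument in item 9, here is a minimal sketch of the two levers mentioned there, a request throttle plus a per-answer compute (token) cap. The tiers and limits are hypothetical, not any vendor's actual policy.

```python
# Toy model of the two levers from item 9: throttling the number of free
# requests, and capping the compute spent on each free answer.
# All limits below are invented for illustration.

from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    max_requests_per_day: int    # throttling
    max_tokens_per_answer: int   # per-query compute cap (cheaper, lower quality)

FREE = Tier("free", max_requests_per_day=50,   max_tokens_per_answer=512)
PAID = Tier("paid", max_requests_per_day=5000, max_tokens_per_answer=4096)

def answer_budget(tier: Tier, used_today: int) -> int:
    """Return the token budget for this request, or 0 if the user is throttled."""
    if used_today >= tier.max_requests_per_day:
        return 0                       # throttled: come back tomorrow, or pay
    return tier.max_tokens_per_answer  # capped compute keeps free queries cheap

print(answer_budget(FREE, used_today=10))   # 512
print(answer_budget(FREE, used_today=50))   # 0 -> throttled
print(answer_budget(PAID, used_today=10))   # 4096
```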
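Finally, a back-of-the-envelope sum of the hardware figures quoted in item 11. One reading is assumed here (mine, not the poster's): eight InfiniBand cards for the whole cluster at $16K each plus one $20K switch. Read as eight cards per node, the total only goes up, so the "$300+K" floor still holds.

```python
# Rough sum of the static hardware costs listed in item 11 (list prices only,
# before enclosures, racking, cooling, and electricity).
# Assumption (mine): eight InfiniBand cards in total at $16K each, plus one switch.

nodes_low, nodes_high = 4 * 40_000, 4 * 50_000   # 4 compute nodes at $40-50K each
infiniband = 8 * 16_000 + 20_000                 # 8 cards + 1 switch
storage = 35_000 + 20_000                        # storage server + connectivity

low, high = nodes_low + infiniband + storage, nodes_high + infiniband + storage
print(f"static hardware: ${low:,} - ${high:,}")  # $363,000 - $403,000, i.e. "$300+K"
# For comparison, the quoted $20K/mo Azure VM reaches this range in roughly 18-20 months.
```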