Everything posted by martinboehme

  1. Ron, you don't need a course setting to be able to display an ILS localizer deviation. See here for an explanation of how it works: http://en.wikipedia.org/wiki/Instrument_Landing_System
     Martin
  2. Geoff, don't know about the 737NG, but AFAIK, at least on some types, the course is used during autolands in a crosswind so the autoland system knows the runway heading to "kick it straight" to. At any rate, it's good practice to set the right course so the display on the EHSI is correct etc.
     Cheers, Martin
  3. Bob, thanks! No problem, just wanted to know if I was doing something wrong (which is the usual reason for something not working the way I expect it to ;-) ).
     Martin
  4. > That 13600 is not about the field length needed to take off. It's the field length needed to stop the beast if something goes wrong before V1.
     Or to accelerate the aircraft to Vr on only three engines if one quits after V1...
     Martin
  5. Today, I tried using the DME HOLD function, but without success (this is with V 1.1). I tuned the frequency of the VOR/DME I wanted to hold on NAV1, and the DME readout was displayed in the HSI. I then switched the switch at the bottom left of NAV1 to DME HOLD, and the green light came on. I now tuned a different frequency, expecting the DME display to be held, but the DME display went blank.
     All of this was with the switch on the right hand side of NAV1 set to "NORM". I tried the same thing with the switch set to "OVRD" (what is this switch supposed to do, by the way?), but this did not make any difference.
     Am I doing something wrong, or is this feature just not implemented?
     Cheers, Martin
  6. The tip tank is only there as a dummy to let the panel know it's being used with the Terry Gaff model (as opposed to the FFX model).
     Martin
  7. > I won't speak publicly about it and I even consider I've given enough clues in letting people know that there is no "hack" to do it and it is directly documented in the gauge.h.
     Roger. ;-)
     Martin
  8. > This is terribly off-topic for the thread
     Apologies by the way for taking the thread so far off topic...
     > I heard an "aviation expert" interviewed on the radio this morning claiming that it was actually more common than you would expect for a plane to taxi to the wrong runway, however, it gets caught either by ground or tower or the pilots themselves, or if they do take off, they are able to maintain the climb.
     Agreed that there are probably quite a few more cases that get caught by the checks and balances -- well, that's why they're there! My original estimate referred to the number of takeoffs from wrong runways that actually take place without being detected until after the fact -- no idea if that number is even in the right ballpark, but I would guess it's pretty small... thanks to all the checks and balances built into the system.
     Martin
  9. > But even in my simulated experience I can't imagine how this can happen. I have seen the charts for the airport and it's beyond me how you can mistake the 26 for the 22.
     I've wondered the same thing with similar accidents. How could a pilot take off from the wrong runway?
     (I'm not offering a premature conclusion about what happened here... but there have been a number of accidents in the past where pilots took off from the wrong runway... some at day in VMC... and the question is: how on earth could something like that happen?)
     My personal explanation for this is: it's actually a very, very rare occurrence. I don't have any numbers on this, but let's assume that the rate of takeoffs from wrong runways in commercial aviation is one in a million takeoffs (my hunch is that, in reality, takeoffs from wrong runways occur even more rarely than that). Now, there are a lot of things where I could hardly imagine myself making a mistake. But when it gets to the point where I would have to do something _a million times_ and be allowed to make only one mistake -- and bet my life on it -- well, there are few things that come to mind where I would be sure that I could accomplish this feat.
     Do a million takeoffs, and there may be one time when all of the holes in the cheese line up -- and you take off from the wrong runway. Of course, this is not to say that no efforts should be made to try and prevent even such rare mistakes...
     Martin
  10. > There's a DirectX pointer that can be accessed... the question is, exactly what is it pointing to? If I knew, I'd tell you.
      Makes sense, though... it's probably the surface that FS renders the gauge to. If one knew how to use it, one could do what Jean-Luc implies... well, thanks for that tidbit of information. Maybe I'll get round to investigating this myself one day...
      > However, don't expect Jean-Luc to speak publicly about it.
      Well, I wouldn't either if I were him... ;-)
      Martin
  11. Jean-Luc, thanks for the clarification about the "psycho-visual filtering". Impressive -- it wasn't clear to me indeed from the description on the website that you were actually simulating the display characteristics of a CRT. That's a point that might indeed be worth emphasizing on the website.
      Interesting stuff too about rendering to a backbuffer that FS can use directly... hmmm, seems I'll have to go rooting through gauges.h and see if I can figure out how it's done... ;-)
      Great thread -- I've learned a lot!
      Martin
  12. Sorry, I had forgotten about your post... thanks for the explanation, now it's clear!
      > Thank you also for your feedback on the lack of "marketing" about TDXP! Inspite of many information on the website, it seems the market reads "GDI+ = smooth" while I'd prefer it reads "TDXP = fluid" ;-)
      Actually, I didn't mean to imply that there was a lack of marketing... just that all of the information about TDXP on the website was shrouded in "marketing speak", so it was kind of hard to tell what TDXP really is. Check this, TDXP has "an advanced psycho-visual filtering stage"... wow! ;-) Not intended as criticism, quite the contrary... that's the kind of stuff that's _supposed_ to be on a website. ;-) It just makes it hard for inquisitive types like me to find out what TDXP really does...
      Your performance numbers sound pretty impressive... I've thought about getting your FlightLine N/T gauges on several occasions... it's just that I don't do a lot of GA flying (I'm more of an airline type).
      Martin
  13. Frit, I'm not worried, because I'm not shooting for prime time... Weekday afternoon programming is good enough for me. ;-)
      Martin
  14. > The point of my posts is to ensure that people clearly understand it is not a GDI+ replacement. It's a graphics API, it renders faster than GDI+, but it is not a replacement. Even the AggPlus library isn't good enough to be called a replacement. It has too many critical functions missing.
      Absolutely. I still intend to evaluate it because if it's good enough for some of the things I want to do and does those things faster than GDI+ does -- great! If it turns out AGG doesn't suit my purposes -- well, I'll still have learned something. Seems you did just this and came to the latter conclusion -- thanks for sharing your insights.
      Thanks also for pointing out the speed issue with text -- that seems to be a major caveat. Maybe there's a workaround (pre-render the font to a bitmap, that kind of thing...), maybe not... and even if there is, maybe it's just too much of a hassle to justify doing it.
      In the end, the good thing is that I get to choose... I can use AGG for those things that it's good at, and GDI+ for the things that _it_ is good at. And if Jean-Luc decides to make TDXP available, we'll have even more choice.
      About subpixel resolution... agreed, there are some cases where it doesn't help a lot, and ultimately, what counts is whether it looks good -- and helps us fly the aircraft! In some situations, I find subpixel resolution can be very helpful -- I find my pitch control is better on ADIs that do subpixel rendering, and I'm grateful that GDI+ and AGG let us implement these kinds of gauges.
      Martin
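      P.S. Here is the kind of text pre-rendering workaround I mean -- a rough, untested sketch in plain GDI+ (GDI+ startup and cleanup omitted; the bitmap size, font, and function names are just made up for illustration): draw the string into a cached off-screen bitmap once, then blit that bitmap on every repaint instead of going through the slow text path each frame.

        #include <windows.h>
        #include <gdiplus.h>

        // Render the string once into a cached off-screen bitmap...
        Gdiplus::Bitmap* CacheText(const WCHAR* text)
        {
            Gdiplus::Bitmap* cache =
                new Gdiplus::Bitmap(256, 32, Gdiplus::PixelFormat32bppARGB);
            Gdiplus::Graphics g(cache);
            Gdiplus::Font font(L"Arial", 12.0f);
            Gdiplus::SolidBrush brush(Gdiplus::Color(255, 255, 255, 255));
            g.DrawString(text, -1, &font, Gdiplus::PointF(0.0f, 0.0f), &brush);
            return cache;
        }

        // ...and on every repaint just blit the cached bitmap.
        void DrawCachedText(Gdiplus::Graphics& target, Gdiplus::Bitmap* cache,
                            float x, float y)
        {
            target.DrawImage(cache, x, y);
        }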
  15. Hey guys, this has been a great thread... please don't turn it into an argument about who has been programming professionally for longer. ;-) Ed gave us an open, no-holds-barred account of his opinion and experiences with GDI+ and AGG... and Frit, I think you interpreted that as a dismissal and overly negative putdown of your enthusiasm for AGG, which I don't think is how it was intended. At least I didn't interpret it that way...
      Frit, Ed, thank you to both of you for sharing your experiences with AGG and GDI+. I've learned a lot from this thread -- and I hope there's plenty more to come!
      Martin
  16. No question... there ain't no such thing as a "subpixel"... show me a monitor that has them and I'll buy it. ;-) But there is such a thing as "subpixel resolution", and it's useful... (at least to me).
      Consider the case where we want to draw a square with one-pixel sides, and those sides are aligned exactly with the pixel boundaries. Let's say we want the square to be drawn at a brightness value of 200, on a black background. This is the image we get:
        0   0   0
        0 200   0
        0   0   0
      Now let's say we shift the square right by half a pixel. This is the result that we have now:
        0   0   0
        0 100 100
        0   0   0
      (One way to get to this result would be to render the image at double the resolution originally, then average over blocks of 2x2 pixels.)
      Some people call this "antialiasing". Some call it "subpixel resolution". No matter what you call it, the interesting thing is: moving the object that you're rendering by less than a pixel results in a change in the resulting image... and a useful change at that, because it gives us information about the position of the object to a resolution of better than a pixel. Say we got this image:
        0   0   0
        0  50 150
        0   0   0
      No prizes for guessing where the square is now...
      The useful thing is that our eyes are pretty good at interpreting these kinds of "subpixel effects", too. Why? Well, our eyes have limited resolution too -- the retina is made up of discrete light-sensitive cells, after all -- and so we have the same "subpixel effects" going on in our eyes all the time.
      (Side note: These effects are also used in computer-based image analysis. At work, I'm currently working on building an eye tracker -- a device that measures where you're looking -- and we can measure the position of the user's pupil in the camera image to a precision of better than one pixel.)
      The nice thing is that there's an objective definition of the result that an ideal "antialiasing" or "subpixel rendering" algorithm should produce. It's the result that you would get by drawing the object at infinite resolution, then laying a pixel grid over it and determining what percentage of each pixel is covered by the object we want to render. That percentage then corresponds to the brightness value that should be assigned to the pixel.
      (Side note to any signal processing gurus who may be lurking in the wings: Yes, this is just a simplified illustration. Yes, I'm using a rect window here, which will lead to leakage and oscillations in the spectral domain. Maybe I should be using a sinc... possibly windowed using a Blackman window... hey, what's this I read in the AGG documentation about sincs and Blackman windows? Hmmm... ;-) )
      Pretty much the same thing happens, by a physical process, when we take a picture of an object using a digital camera -- if there are hard edges in the object, those will show up as "antialiasing" in the camera image. In fact, we can use this as a "gold standard" for a rendering algorithm: The result of rendering any object should ideally be the result we would have obtained by taking a picture of the same object using a good digital camera. (Talk about "photoreal" ;-) ) Of course, any rendering library, such as GDI+ or AGG, will only implement an approximation of this "ideal"... but the closer it is to the ideal, the better the quality of the rendering. How GDI+ and AGG stack up in this comparison, I don't know...
      Martin
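      P.S. To make the "percentage covered" idea concrete, here is a rough little sketch (plain C++, purely illustrative -- not taken from GDI+ or AGG) that computes the coverage values for the shifted one-pixel square above:

        // Coverage-based "subpixel" rendering of a 1x1 square on a 3x3 grid.
        // The square's left edge sits at x = 1 + shift, its top edge at y = 1,
        // brightness 200 on a black background.
        #include <algorithm>
        #include <cstdio>

        int main()
        {
            const double shift = 0.75;       // horizontal offset, in fractions of a pixel
            const double brightness = 200.0;

            double image[3][3] = {};
            for (int y = 0; y < 3; ++y) {
                for (int x = 0; x < 3; ++x) {
                    // Fraction of pixel (x, y) covered by the square
                    // [1 + shift, 2 + shift] x [1, 2].
                    double ox = std::max(0.0, std::min(x + 1.0, 2.0 + shift) - std::max(x + 0.0, 1.0 + shift));
                    double oy = std::max(0.0, std::min(y + 1.0, 2.0) - std::max(y + 0.0, 1.0));
                    image[y][x] = brightness * ox * oy;
                }
            }

            // shift = 0.5 gives a middle row of 0 100 100, shift = 0.75 gives
            // 0 50 150 -- the "subpixel" information described above.
            for (int y = 0; y < 3; ++y)
                std::printf("%4.0f %4.0f %4.0f\n", image[y][0], image[y][1], image[y][2]);
            return 0;
        }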
  17. JeanLuc,
      > Bien sur, tu lance une telle perche! :-)
      I know that's your private conversation with Eric ;-), but... "lancer une perche", what does that idiom mean?
      > Thank you for your confidence in TDXP Eric. It is not ready for prime time yet, but I'm definitely considering this.
      You would consider releasing TDXP? Wouldn't this erode a lot of Reality XP's competitive advantage?
      Thanks by the way for the information on TDXP... I had always been wondering exactly what kind of technology was behind that name, but I was never really able to tell from the marketspeak on the Reality XP website ;-)
      Martin
  18. Thanks for the illustrative example... this assumes that one "unit" of offset translates into one pixel of movement on the ILS needle... which implies a gauge width of around 250 pixels. Since many gauges aren't quite as wide as that, the situation in reality would be even worse... so the limitation isn't the resolution of the ILS offset after all...
      > Its kinda of hard thinking sub pixel, but believe me, you will actually notice it.
      Oh absolutely, no question... that's why I'm so eager to try out AGG.
      Martin
  19. Thanks for the info.
      > I have been told by many people over the years the biggest complaint they have with gdi+ screens beside it killing performance big time is the not so accurate movement of say the HSI needles etc.
      Recently, I wrote a GDI+-based ADI, and I noticed that the ILS needles didn't move quite as smoothly as I thought they should (and not as smoothly as the artificial horizon, for instance). My impression is that there is an additional issue here because the localizer deviation variable can only take on integer values between -127 and 127... and that's coarse enough to (just) be noticeable. I think the same problem could apply to HSIs since it's the same variable.
      I briefly contemplated writing some code that would take the coordinates of the localizer (accessible as token variables, IIRC) and of the plane and then use those to compute the deviation myself (with arbitrary precision)... but decided it wasn't really worth the hassle...
      Martin
      P.S. Great thread by the way, lots of useful info!
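      P.P.S. In case anyone is curious, this is roughly what I had in mind -- a rough, untested sketch (flat-earth approximation, magnetic variation ignored, and the parameter names are made up; the real positions would have to come from the token variables):

        #include <cmath>

        // Angular localizer deviation in degrees, computed from raw positions
        // instead of the coarse -127..127 panel variable. Positive means the
        // localizer antenna lies to the right of the inbound course as seen
        // from the aircraft, i.e. the aircraft is left of the centerline.
        double LocalizerDeviationDeg(double acLatDeg, double acLonDeg,
                                     double locLatDeg, double locLonDeg,
                                     double inboundCourseDeg)
        {
            const double degToRad = 3.14159265358979323846 / 180.0;

            // Flat-earth approximation -- plenty good at localizer ranges.
            double dLat = locLatDeg - acLatDeg;
            double dLon = (locLonDeg - acLonDeg) * std::cos(acLatDeg * degToRad);

            // True bearing from the aircraft to the localizer antenna.
            double bearingDeg = std::atan2(dLon, dLat) / degToRad;

            // Difference to the inbound course, wrapped to [-180, 180).
            double dev = bearingDeg - inboundCourseDeg;
            while (dev < -180.0) dev += 360.0;
            while (dev >= 180.0) dev -= 360.0;
            return dev;
        }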
  20. > I'm sorry to say it is still confidential, even if some people on this forum already know about it ;-) I'll tell you more as soon as I can.
      OK, thanks... looking forward to hearing more (knowing your work, I expect it will be good...)
      Martin
  21. Eric, OK, thanks for the clarification. I'll have to check for myself then. But even if it doesn't support this feature, it shouldn't be too much of a problem to put a layer on top of AGG that transforms the line widths...
      > Hopefully you will soon have the opportunity to see screenshots of the project I am working on, and you will see what I am talking about.
      Hm... sounds good. What are you working on? Or is that a secret?
      Martin
  22. Eric, wow, that was a quick answer... ;-)
      So does this mean that AGG doesn't support transforms the way GDI+ does, or is line width simply not scaled when you apply a transform?
      BTW, thanks for your A320 and F-16 gauges!
      Martin
  23. Frit, fascinating stuff... Thanks for doing that sample gauge, I'll have to try it out. I came across AGG a while ago, had a look at it, and it looked good, but never got round to trying it out. Now you've convinced me that I'll have to take a look at it...
      > I've run timers on renderings with the SD and from the time it clears the buffers, renders the entire scene and blits the resultant bitmaps was around 0.00006 milliseconds..... which is 0.06 nanoseconds according to the timer but I found that hard to believe it could calculate that fast....
      I find that hard to believe, too ;-) I think your measurement is going wrong somewhere... At a clock rate of 4 GHz, one clock cycle is 0.25 ns... so AGG is rendering the scene in a quarter of a clock cycle? I'm willing to believe it's good, but not that good... ;-)
      But AGG certainly sounds like an interesting proposition. I see an SVG parser is included, too... that might be interesting for complex display elements... you could design those in an SVG editor.
      Thanks for the info!
      Martin
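      P.S. For what it's worth, here is roughly how I would time a render pass on Windows -- again just a rough, untested sketch, where RenderScene() is a stand-in for whatever AGG drawing code is being measured; repeating it many times helps avoid being fooled by timer granularity:

        #include <windows.h>
        #include <cstdio>

        void RenderScene();  // stand-in: clear buffers, render the scene, blit

        void TimeRendering()
        {
            LARGE_INTEGER freq, start, stop;
            QueryPerformanceFrequency(&freq);

            const int iterations = 1000;
            QueryPerformanceCounter(&start);
            for (int i = 0; i < iterations; ++i)
                RenderScene();
            QueryPerformanceCounter(&stop);

            double totalSec = double(stop.QuadPart - start.QuadPart) / double(freq.QuadPart);
            std::printf("%.3f ms per frame\n", totalSec / iterations * 1000.0);
        }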
  24. Ed,
      > But having to change the thickness or the position to get proper clarity is a workaround. Requiring a workaround is an indication of a flaw.
      [snip]
      I have to admit I haven't tried AGG myself... but I find the examples I saw here very convincing: http://www.antigrain.com/doc/introduction/...oc.html#toc0005
      (Scroll down to "Lines Rendered with Anti-Aliasing and Subpixel Accuracy", for example. Even the lines with subpixel width look very clear on my display. I find the spiral very convincing, too.)
      > If the rendering routine results in lines that appear so thin that at 1600x1200x32 they're practically non-existent... there's a problem. AGG does this...
      I think the problem may be that even straight single-pixel lines can be hard to make out at that kind of resolution... Have you tried taking a screenshot, then blowing it up to see what the image really looks like?
      I guess the solution to this is probably that line width (in pixels) should scale with the size of the gauge (in pixels). On a 1600x1200 display, this would result in a greater line width than on 1024x768 (which makes sense, since the gauge occupies more pixels). Strictly speaking, this should be done anyway so the lines are always proportionally the same width relative to the size of the display...
      Anyway, the proof is in the pudding... So I guess I'll just have to try out AGG myself.
      Martin
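      P.S. Something like this is what I mean by scaling the line width with the gauge size -- just a rough illustration; the function name and the reference size are made up:

        // Pen width in screen pixels, derived from a "design" width that was
        // chosen for a reference gauge size (here 1024 px across).
        float ScaledLineWidth(float designWidthPx, int gaugeSizePx, int designSizePx = 1024)
        {
            float w = designWidthPx * float(gaugeSizePx) / float(designSizePx);
            return w < 1.0f ? 1.0f : w;  // never let a line drop below one pixel
        }

        // e.g. a 1.5 px line designed at 1024 becomes about 2.3 px at 1600,
        // and is clamped to 1 px on small popup gauges.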
  25. Michele, no load and trim sheets, I'm afraid, but if you're looking for the definitive info on the pallet positions, there's some good information here: http://www.boeing.com/commercial/airports/737.htm
      (Look under Section 2, "Airplane Description")
      Martin