> An application is being tested in a Windows XP Embedded SP2 guest machine
> under Virtual PC 2004 SP1 on a Windows XP Pro SP2 real machine. Virtual PC
> settings give the guest machine's COM1 control of the real machine's COM1.
> The application calls SetupComm with both buffer sizes set to 1200. In
> these tests the external device sends 1036 bytes at a time. The baud
> rate is 57600 with 8 data bits plus parity (11 bits per byte counting
> start and stop bits), so each of these transmissions takes about 200
> milliseconds. DTR is permanently enabled because the device requires
> it, but the device is not capable of flow control: if we disable DTR
> we lose data instead of delaying it, and software flow control is far
> beyond the capabilities of the device.
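> For reference, here is roughly how the port gets configured. This is
> a minimal sketch; the function name, the parity choice, and the
> omitted error handling are illustrative rather than the real code:
>
>     #include <windows.h>
>
>     HANDLE OpenDevicePort(void)
>     {
>         HANDLE hPort = CreateFileA("COM1", GENERIC_READ | GENERIC_WRITE,
>                                    0, NULL, OPEN_EXISTING, 0, NULL);
>         if (hPort == INVALID_HANDLE_VALUE)
>             return hPort;
>
>         /* 1200-byte receive and transmit queues, as described above. */
>         SetupComm(hPort, 1200, 1200);
>
>         DCB dcb;
>         dcb.DCBlength = sizeof(dcb);
>         GetCommState(hPort, &dcb);
>         dcb.BaudRate = CBR_57600;
>         dcb.ByteSize = 8;
>         dcb.fParity  = TRUE;
>         dcb.Parity   = EVENPARITY;             /* parity on; even is a guess */
>         dcb.StopBits = ONESTOPBIT;
>         dcb.fDtrControl  = DTR_CONTROL_ENABLE; /* DTR held high permanently */
>         dcb.fOutxCtsFlow = FALSE;              /* device has no flow control */
>         dcb.fOutX = dcb.fInX = FALSE;
>         SetCommState(hPort, &dcb);
>         return hPort;
>     }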
> As background, on some real machines running Windows XP Pro SP2, the
> application is getting CE_OVERRUN. We're trying to figure out what to do
> about that, but at least we know where it's happening. (Some of our
> experiments use a USB-serial adapter whose hardware receive buffer is
> either 384 bytes or 4K, we're not sure which; either way we still get
> CE_OVERRUN.) Anyway, we've never seen CE_RXOVER with this application on a
> real machine.
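> (We see the errors through ClearCommError after reading. A simplified
> sketch of that check, with an illustrative function name:)
>
>     #include <windows.h>
>     #include <stdio.h>
>
>     void ReportCommErrors(HANDLE hPort)
>     {
>         DWORD errors = 0;
>         COMSTAT stat;
>         if (ClearCommError(hPort, &errors, &stat)) {
>             if (errors & CE_OVERRUN)
>                 /* Hardware overrun: a byte arrived before the UART's
>                    receive register/FIFO was emptied. */
>                 printf("CE_OVERRUN\n");
>             if (errors & CE_RXOVER)
>                 /* Software overflow: the driver's receive queue (the
>                    size passed to SetupComm) filled before the
>                    application read it. */
>                 printf("CE_RXOVER\n");
>             printf("%lu bytes waiting\n", (unsigned long)stat.cbInQue);
>         }
>     }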
> Under Virtual PC the application sometimes gets CE_RXOVER instead of
> CE_OVERRUN. Where can this be happening? Does Virtual PC use a buffer
> of its own that's smaller but happens to be good enough 99% of the
> time, and then somehow reflect its own overflow into the guest
> machine's serial status? But the guest machine thinks the serial port
> is a real one (and maybe it really is the real one), so how could it
> be hit with a software error that isn't occurring in the guest machine?
> In this setup, where can CE_RXOVER possibly be coming from?
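> (If CE_RXOVER really originates in the guest's own serial driver,
> then a deliberate test like the hypothetical sketch below, where the
> idle time is a guess at how long it takes more than 1200 bytes to
> arrive, ought to be able to force it on demand:)
>
>     /* Hypothetical probe, same headers as above: stop reading long
>        enough that more than 1200 bytes (the SetupComm queue size)
>        should have arrived, then check for CE_RXOVER. */
>     void ProbeRxOverflow(HANDLE hPort, DWORD idleMs)
>     {
>         DWORD errors = 0;
>         COMSTAT stat;
>         Sleep(idleMs);      /* long enough for two 1036-byte bursts */
>         ClearCommError(hPort, &errors, &stat);
>         if (errors & CE_RXOVER)
>             printf("CE_RXOVER reproduced: queue filled while idle\n");
>     }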