Windows 7 (64-bit) multicast UDP video streaming shows random packet drops/losses; Windows XP does not show this behavior
I have a Supermicro server with Windows 7 (64-bit) installed on it. The system is connected to an IPTV network carrying multicast UDP video streams. Whenever I join any multicast video stream from this Windows 7 machine, I get random packet drops/losses on the received stream. I connected the system to two different switches to study the behavior, but both show random packet drops. I also connected a few XP machines to the same network at the same time, and none of the XP machines experience the packet loss I see on the Windows 7 machine. If I connect this Windows 7 machine directly to the streaming server and use directed unicast (no multicast), the machine does not experience any loss, which seems to indicate that the network NIC (Intel 82574L, an onboard NIC) is not the culprit.

Has anyone experienced Windows 7 multicast and/or switch compatibility issues similar to the one I am seeing here?

Details:
Computer: dual 6-core Intel Westmere processors
NIC: Intel 82574L gigabit network card (I did a driver update on the NIC, but the problem is still present).

Thanks in advance for the help. -P.

(Sorry for the repost if this comes up as a duplicate; I did not see my previous post on the same issue in my thread list.) -P.
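For context, my receiver applications join the streams in the usual way. A minimal sketch of that kind of multicast receiver in Python (the group address and port here are placeholders, not my actual stream parameters):

```python
import socket
import struct

# Placeholder values -- substitute the real group/port of your stream.
GROUP = "239.1.1.1"
PORT = 5004

def make_multicast_receiver(group: str, port: int, iface: str = "0.0.0.0") -> socket.socket:
    """Create a UDP socket bound to `port` and joined to `group` on `iface`."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Allow several receiver apps on the same host to share the port.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # ip_mreq: 4-byte group address followed by 4-byte local interface address.
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(iface))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

The IP_ADD_MEMBERSHIP join is what triggers the IGMP membership report the switch snoops on, so this is the point where the host's multicast behavior starts.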
November 19th, 2010 10:25am

Hi, First, I suggest you disable your firewall software and try again. I would also like to share the following document with you. Windows Media Server can generate UDP packets larger than 1500 bytes when you deploy high-bit-rate content. Because Ethernet frames carry at most 1500 bytes, this causes the IP packets for Windows Media Server streams to be fragmented. Fragmentation can cause performance degradation and packet loss. For detailed information, please refer to the following link: http://support.microsoft.com/kb/252360
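To estimate whether a given stream will fragment, you can compare its UDP payload size against the Ethernet MTU. A quick sketch, assuming standard IPv4 and UDP header sizes with no IP options:

```python
# Largest UDP payload that fits in one IP packet on standard Ethernet.
ETH_MTU = 1500   # bytes available to the IP layer on standard Ethernet
IP_HEADER = 20   # IPv4 header without options
UDP_HEADER = 8   # UDP header

def max_unfragmented_payload(mtu: int = ETH_MTU) -> int:
    """Largest UDP payload that avoids IP fragmentation for this MTU."""
    return mtu - IP_HEADER - UDP_HEADER

def will_fragment(payload_len: int, mtu: int = ETH_MTU) -> bool:
    """True if a UDP datagram of this payload size would be fragmented."""
    return payload_len > max_unfragmented_payload(mtu)
```

So any datagram over 1472 bytes of payload will be split across Ethernet frames, and losing any one fragment discards the whole datagram.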
November 24th, 2010 3:48am

Hi Juke, Thanks for the reply and the link. The issue is present even with the firewall disabled.

I did more experiments on the system and found an interesting behavior on Windows 7 that does not happen on XP. My experimental setup: a multicast server sends 3 separate multicast streams to an IGMP-enabled switch to which my Windows 7 unit is connected. There is no multicast router in the setup. While the sender is sending the 3 streams (the switch floods them on all connected ports), if 3 receiver applications are running on the Windows 7 unit, one per stream, I see no packet drops on any of the 3 streams. If I stop one receiver application, I start to see packet drops in the other two streams. It looks as if, when multicast packets reach the NIC and/or operating system with no consumer for them on the Windows 7 machine, the network stack starts dropping packets for the other streams as well.

Does anyone know who is responsible for dropping the packets when no one is listening for a multicast stream, the NIC or the OS? To me it sounds like a Windows 7 issue. Any thoughts and/or pointers to fix or better understand this issue?

Thanks, P. -P.
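One receiver-side knob worth ruling out while debugging this is the socket receive buffer: if the stack is briefly flooded with unwanted multicast, a small per-socket buffer makes the listening streams drop first. A minimal sketch of requesting a larger kernel buffer (the 4 MB request is an arbitrary figure I picked for illustration; the OS may grant less, so read the option back):

```python
import socket

def make_udp_receiver(port: int, rcvbuf_bytes: int = 4 * 1024 * 1024) -> socket.socket:
    """UDP socket that asks the kernel for an enlarged receive buffer.

    The OS may grant less than requested (Linux caps it at
    net.core.rmem_max; Windows has its own limits), so callers should
    read SO_RCVBUF back to see what they actually got.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, rcvbuf_bytes)
    sock.bind(("", port))
    return sock
```

This would only mask, not explain, the behavior above, but it helps separate "buffer exhaustion under burst" from a genuine stack or driver bug.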
November 24th, 2010 12:50pm

This topic is archived. No further replies will be accepted.
