
Poor Performance using IHttpClientFactory with SocketsHttpHandler with Machine Name #38464

Open · scostaNextGen opened this issue Jun 26, 2020 · 5 comments
Labels: area-System.Net.Http, backlog-cleanup-candidate (an inactive issue marked for automated closure), no-recent-activity
Milestone: Future

Comments

@scostaNextGen
I have an application that uses the IHttpClientFactory pattern. When I use a machine name for the host name and a new connection needs to be created, I see response times of around 4 seconds.

On subsequent calls, once there was already an active connection to the host machine on the given port, my response times were very fast, around 30 ms.

It looks like SocketsHttpHandler ends up looping over every IPAddress returned by Dns.GetHostAddresses. Each failed connection attempt to an IP adds a delay of about 1 second.

The easiest way I was able to reproduce this was as follows. First, when I called Dns.GetHostAddresses myself, I noticed that 6 IPAddresses were returned, and the first three were IPv6 addresses. My Kestrel listener at the time was binding only to IPv4 addresses via .UseUrls($"0.0.0.0:{port}"). So, I assumed that when the connection attempt was made, it would start with the first IPv6 address, fail, move to the next, and so on. Once it reached the first IPv4 address, on the 4th attempt, the connection succeeded. When I changed Kestrel to bind to both IPv6 and IPv4 addresses, the issue disappeared, since the first attempt to connect using an IPv6 address now succeeded.
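The address ordering described above can be inspected with a small console program like the following (a repro sketch, not code from the issue; on the affected machine this would show IPv6 entries listed before IPv4):

```csharp
using System;
using System.Net;

class DnsCheck
{
    static void Main()
    {
        // Print each resolved address and its family, in the order DNS returns them.
        // SocketsHttpHandler attempts connections in roughly this order, so IPv6
        // entries listed first are tried first.
        foreach (IPAddress addr in Dns.GetHostAddresses(Environment.MachineName))
            Console.WriteLine($"{addr.AddressFamily}: {addr}");
    }
}
```

In typical configurations, binding Kestrel with `.UseUrls($"http://*:{port}")` listens on both address families, which is one way to make the first (IPv6) connection attempt succeed.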

The issue also does not occur when I use an IP address, only when I use machine name.

I have resorted to calling Dns.GetHostAddresses myself, storing the result in a Dictionary keyed by host name, and then looping over those values and sending the IP as the host instead of the machine name. If a particular IP fails, I remove it from the list so that subsequent HttpClient requests use an IP I know works, instead of retrying the same failed addresses every time a new connection is needed. To handle DNS changes, any time I need to loop over the IPs, I call Dns.GetHostAddresses first to see whether they changed. Something similar should be baked into the framework so that such an extensive workaround isn't necessary.
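The workaround above could be sketched along these lines (a hedged illustration of the described approach; the type and method names `IpCache`, `GetOrRefresh`, and `MarkFailed` are hypothetical, not from the issue):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Net;

class IpCache
{
    private readonly ConcurrentDictionary<string, List<IPAddress>> _cache = new();

    // Re-resolve on each lookup so DNS changes are picked up, while preserving
    // the cached list's ordering and removals for addresses still in DNS.
    public IReadOnlyList<IPAddress> GetOrRefresh(string host)
    {
        IPAddress[] fresh = Dns.GetHostAddresses(host);
        return _cache.AddOrUpdate(
            host,
            _ => fresh.ToList(),
            (_, cached) =>
            {
                // Keep known-good cached entries first, then append new addresses.
                var merged = cached.Where(fresh.Contains).ToList();
                merged.AddRange(fresh.Where(a => !merged.Contains(a)));
                return merged;
            });
    }

    // Call when a connection attempt to `addr` fails, so it isn't retried
    // by later requests. (Sketch only: list mutation isn't synchronized here.)
    public void MarkFailed(string host, IPAddress addr)
    {
        if (_cache.TryGetValue(host, out var list))
            list.Remove(addr);
    }
}
```

The caller would then build the request URI from one of the cached addresses rather than the machine name.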

@Dotnet-GitSync-Bot (Collaborator)

I couldn't figure out the best area label to add to this issue. If you have write permissions, please help me learn by adding exactly one area label.

@Dotnet-GitSync-Bot added the untriaged label on Jun 26, 2020
@scalablecory (Contributor)

In .NET 5 you will be able to plug in a custom resolver/connector that does this via #1793.

In .NET 6 we plan to implement a new algorithm to make scenarios like this connect significantly faster via #26177.
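For context, the .NET 5 hook is the SocketsHttpHandler.ConnectCallback property, which lets the caller take over name resolution and connection. A sketch of how it could be used to prefer IPv4 for a server that only binds IPv4 (an illustration under that assumption, not a recommended default):

```csharp
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Sockets;

var handler = new SocketsHttpHandler
{
    ConnectCallback = async (context, cancellationToken) =>
    {
        // Resolve the host ourselves and try an IPv4 address first, since in
        // this scenario the Kestrel server is listening only on IPv4.
        IPAddress[] addresses = await Dns.GetHostAddressesAsync(context.DnsEndPoint.Host);
        IPAddress address = addresses
            .OrderBy(a => a.AddressFamily == AddressFamily.InterNetwork ? 0 : 1)
            .First();

        var socket = new Socket(address.AddressFamily, SocketType.Stream, ProtocolType.Tcp)
        {
            NoDelay = true
        };
        await socket.ConnectAsync(new IPEndPoint(address, context.DnsEndPoint.Port), cancellationToken);
        return new NetworkStream(socket, ownsSocket: true);
    }
};

var client = new HttpClient(handler);
```

A fuller version would fall back to the remaining addresses if the first connect fails.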

@scalablecory added the area-System.Net.Http label and removed the untriaged label on Jun 27, 2020
@ghost commented Jun 27, 2020

Tagging subscribers to this area: @dotnet/ncl
Notify danmosemsft if you want to be subscribed.

@scalablecory scalablecory added this to the Future milestone Jun 27, 2020
@wfurt (Member) commented Jun 28, 2020

This also looks like a misconfiguration, not necessarily an issue with HttpClient. The issues @scalablecory linked may help with mitigation.

@dotnet-policy-service (bot)

Due to lack of recent activity, this issue has been marked as a candidate for backlog cleanup. It will be closed if no further activity occurs within 14 more days. Any new comment (by anyone, not necessarily the author) will undo this process.

This process is part of our issue cleanup automation.

@dotnet-policy-service bot added the backlog-cleanup-candidate and no-recent-activity labels on Dec 28, 2024