SQL Server TCP and UDP Ports
SQL Server has evolved from a simple relational database engine to a multipurpose
enterprise-level data platform. The subsystems and features that Microsoft has added, and
continues to add, to SQL Server have significantly increased the network connections that
the platform uses. Sometimes it's tricky to figure out which firewall ports to open for each
feature. To help you, here's a rundown of commonly used SQL Server network ports.
TCP 1434
TCP port 1434 is the default port for the Dedicated Admin Connection. You can start the
Dedicated Admin Connection through sqlcmd or by typing ADMIN: followed by the server
name in the SSMS Connect to Database Engine dialog box.
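As a hedged illustration (the server name below is hypothetical, not from the article), the ADMIN: prefix and the sqlcmd -A switch are two ways to reach the same DAC endpoint:

```python
# Illustrative sketch: two ways to target the Dedicated Admin Connection.
# "SQLHOST" is a hypothetical server name. Prefixing it with ADMIN:
# routes the connection to the DAC endpoint (TCP 1434 by default)
# instead of the instance's normal listener.
server = "SQLHOST"

# What you would type in the SSMS Connect to Database Engine dialog:
dac_server_name = f"ADMIN:{server}"

# The equivalent sqlcmd invocation; the -A switch requests the DAC:
sqlcmd_invocation = f"sqlcmd -S {server} -A"

print(dac_server_name)    # ADMIN:SQLHOST
print(sqlcmd_invocation)  # sqlcmd -S SQLHOST -A
```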
UDP 1434
UDP port 1434 is used for SQL Server named instances. The SQL Server Browser service
listens on this port for incoming connections to a named instance. The service then
responds to the client with the TCP port number for the requested named instance.
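To make the exchange concrete, here's a minimal Python sketch of the SQL Server Resolution Protocol messages the Browser service speaks on UDP 1434. The server and instance names are made up, and the response is a hardcoded sample rather than a live reply, so treat this as a sketch of the wire format, not a production client:

```python
import struct

def build_instance_request(instance_name: str) -> bytes:
    """CLNT_UCAST_INST: a 0x04 byte followed by the instance name."""
    return b"\x04" + instance_name.encode("ascii") + b"\x00"

def parse_browser_response(payload: bytes) -> dict:
    """SVR_RESP: a 0x05 byte, a 2-byte little-endian length, then a
    semicolon-delimited key/value string that includes the TCP port."""
    if payload[0] != 0x05:
        raise ValueError("not an SSRP response")
    length = struct.unpack_from("<H", payload, 1)[0]
    text = payload[3 : 3 + length].decode("ascii")
    tokens = text.rstrip(";").split(";")
    return dict(zip(tokens[0::2], tokens[1::2]))

# Parse a hardcoded sample response (hypothetical host and instance);
# no live server is contacted here.
body = (b"ServerName;SQLHOST;InstanceName;SALES;IsClustered;No;"
        b"Version;15.0.2000.5;tcp;1433;;")
sample = b"\x05" + struct.pack("<H", len(body)) + body

info = parse_browser_response(sample)
print(info["tcp"])  # 1433 -- the port the client reconnects to over TCP
```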
TCP 2383
TCP port 2383 is the default port for SQL Server Analysis Services.
TCP 2382
TCP port 2382 is used for connection requests to a named instance of Analysis Services.
Much like the SQL Server Browser service does for the relational database engine on UDP
1434, the SQL Server Browser listens on TCP 2382 for requests for Analysis Services named
instances. Analysis Services then redirects the request to the appropriate port for the named
instance.
TCP 135
TCP port 135 has several uses. The Transact-SQL debugger uses the port. TCP 135 is also
used to start, stop, and control SQL Server Integration Services, although it is required only
if you connect to a remote instance of the service from SSMS.
TCP 80 and TCP 443
TCP ports 80 and 443 are most typically used for report server access. However, they also
support URL requests to SQL Server and Analysis Services. TCP 80 is the standard port for
HTTP connections that use a URL. TCP 443 is used for HTTPS connections that use Secure
Sockets Layer (SSL).
TCP 4022 and TCP 7022
Microsoft uses TCP port 4022 for SQL Server Service Broker examples in SQL Server Books
Online. Likewise, BOL Database Mirroring examples use TCP port 7022.
This summary should cover your most pressing port needs. You can find more detailed
information about the TCP and UDP ports that SQL Server uses in the Microsoft article
"Configure the Windows Firewall to Allow SQL Server Access."
Examining Scalar Function Performance in SQL Server
Microsoft SQL Server provides a number of ways for data professionals to glean knowledge
of the inner workings of the platform: from Dynamic Management Views (DMVs) and
Functions (DMFs) to Extended Events, Catalog Views, and an endless supply of third-party
tools, you can find out pretty much anything about the internals of SQL Server if you know
where to look.
I’ve written extensively about Dynamic Management Objects (DMOs), and each new release
of SQL Server brings with it new DMOs. The one we’re focusing on today has been around
since Microsoft SQL Server 2016: dm_exec_function_stats.
As is the case with its cousins, dm_exec_function_stats has a significant number of output
columns that cover the classes of key indicators you'd expect when performance tuning in
SQL Server: execution counts, worker (CPU) time, elapsed time, and logical and physical
reads and writes.
The existence of sql_handle and plan_handle in this DMV allows you to join to other DMOs
to return the defining code for the function and see the graphical execution plan. You can
also use system functions like db_name() and object_name() to make columns such as
database_id and object_id more user-friendly when looking at results.
Take, for example, the following query to look at the state of user-defined scalar functions
for a given instance/database sorted by duration. By default, this query returns results for
the currently in-scope database through the WHERE clause I’ve coded:
SELECT db_name(eFS.database_id) AS [database_name]
	, object_name(eFS.[object_id], eFS.database_id) AS [function_name]
	, eFS.execution_count
	-- CPU
	, eFS.min_worker_time AS [min_cpu]
	, eFS.max_worker_time AS [max_cpu]
	, eFS.total_worker_time/ISNULL(eFS.execution_count, 1) AS [avg_cpu]
	-- ELAPSED TIME
	, eFS.min_elapsed_time AS [min_duration]
	, eFS.max_elapsed_time AS [max_duration]
	, eFS.total_elapsed_time/ISNULL(eFS.execution_count, 1) AS [avg_duration]
	-- LOGICAL READS
	, eFS.min_logical_reads AS [min_logical_reads]
	, eFS.max_logical_reads AS [max_logical_reads]
	, eFS.total_logical_reads/ISNULL(eFS.execution_count, 1) AS [avg_logical_reads]
	-- PHYSICAL READS
	, eFS.min_physical_reads AS [min_physical_reads]
	, eFS.max_physical_reads AS [max_physical_reads]
	, eFS.total_physical_reads/ISNULL(eFS.execution_count, 1) AS [avg_physical_reads]
	-- LOGICAL WRITES
	, eFS.min_logical_writes AS [min_writes]
	, eFS.max_logical_writes AS [max_writes]
	, eFS.total_logical_writes/ISNULL(eFS.execution_count, 1) AS [avg_writes]
	, eFS.last_execution_time
	, eST.text AS [function_code]
	, eFS.[plan_handle]
FROM sys.dm_exec_function_stats eFS
	CROSS APPLY sys.dm_exec_sql_text(eFS.sql_handle) eST
WHERE eFS.database_id = db_id()
ORDER BY [avg_duration] DESC;
--ORDER BY [avg_cpu] DESC;
--ORDER BY [avg_logical_reads] DESC;
--ORDER BY [avg_writes] DESC;
I’ve broken the results up here into logical “chunks” to make what is a wide result set
easier to read.
The query is easily adjusted to focus on just those functions you’re interested in: sort by
duration, reads, writes, or CPU depending on which line you choose to uncomment in the
ORDER BY clause.
Another thing to consider when looking at the results that come from these DMOs is the
appropriate unit of measure. You’re not likely to measure a garage in millimeters to see if
your new car is going to fit; you’ll use a unit of measure more appropriate for the job. The
same goes for functions compared to procedures or ad hoc query executions. Duration in
the various columns provided by these objects is measured in microseconds (1/1,000,000 of
a second). When I’ve written queries using dm_exec_procedure_stats and
dm_exec_query_stats for analysis, I’ve always reduced that unit of measure by a factor of
1,000 to measure duration as milliseconds (1/1,000 of a second), which matches the default
unit of measure for performance benchmarks for queries. Since functions are but a
component of a larger query (be it in stored procedure or ad hoc form), I leave the unit of
measure as microseconds.
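As a quick sketch of that conversion (the DMV values below are invented for illustration), going from microseconds to milliseconds is simply division by 1,000:

```python
# dm_exec_function_stats reports its time columns in microseconds.
# Sample values below are hypothetical, not from a real instance.
total_elapsed_time_us = 4_512_000   # made-up total_elapsed_time value
execution_count = 12

avg_duration_us = total_elapsed_time_us / execution_count  # microseconds
avg_duration_ms = avg_duration_us / 1_000                  # milliseconds

print(avg_duration_us)  # 376000.0
print(avg_duration_ms)  # 376.0
```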
In examining the results, the first thing that jumps out at me is the range of execution
duration times between the average and the max values. I’d be really curious to dig into
these results further to ascertain why performance is (possibly) inconsistent. I find this
query valuable when attempting to identify functions that may warrant tuning attention. It’s
also valuable for zeroing in on performance improvements when my analysis of stored
procedures or ad hoc SQL shows that functions play a role.
dm_exec_function_stats is another one of those tools to keep on hand when analyzing
performance in your Microsoft SQL Server instances.