The Pure Storage FlashArray supports Fibre Channel (FC) connectivity. Setting up FC connectivity to the FlashArray is straightforward: the configuration lives within the fabric of the infrastructure and does not involve protocol configuration on the host as iSCSI does.
The screenshot below illustrates four connected FC ports (CT0.FC0/FC1 and CT1.FC0/FC1) on a Pure Storage FlashArray//M20. These ports are configured in Step 05.2 -- Setup FC Connectivity, which references this article.
The following steps illustrate how to connect to the Pure Storage FlashArray FC ports using the Windows Server Disk Management tools.
Previous Windows Server Best Practice Guides noted that the Emulex HBA Queue Depth and NodeTimeOut settings should be changed from their defaults. This is no longer recommended; those settings were used with earlier versions of the Purity Operating Environment and the FlashArray 300/320.
Setup Host, WWNs and Volume Connectivity with FlashArray Management Tools
This section walks through the steps for configuring FC using the Graphical User Interface (GUI) tools provided by the Pure Storage FlashArray management interface.
Configure FlashArray Host, WWNs and Volume
1. Open the Pure Storage FlashArray Management interface and log in to the FlashArray.
2. Click on the Storage tab.
3. Click on the + in the Hosts section and select Create Host. Enter a name for the host; this example uses Server01.
4. Select the newly created host, Server01, then click the Host Ports tab. Click the 'hamburger menu' to the right, then click Configure Fibre Channel WWNs.
5. The Configure Fibre Channel WWNs for Host dialog box will open. Select the WWNs from the Existing WWNs list. To identify the proper WWNs for a host, refer to Retrieve World Wide Names (WWNs) on Windows Server.
With a properly configured fabric that has a Pure Storage FlashArray connected, the WWNs for all hosts connected to the fabric will be displayed in the Existing WWNs list. If they do not show up and it is necessary to enter them manually, refer to Retrieve World Wide Names (WWNs) on Windows Server.
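As a quick host-side check, the WWNs can also be listed directly on Windows Server with the Get-InitiatorPort cmdlet from the built-in Storage module. This is a sketch only; the exact output format varies by HBA driver:

```powershell
# List the Fibre Channel initiator ports (WWNN/WWPN) on this host.
# NodeAddress is the World Wide Node Name, PortAddress the World Wide Port Name.
Get-InitiatorPort |
    Where-Object { $_.ConnectionType -eq "Fibre Channel" } |
    Select-Object NodeAddress, PortAddress
```

The PortAddress values are what should appear in the Existing WWNs list on the FlashArray.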
6. Click on the + in the Volumes section to create a volume. This example uses the name FC-TestVolume with a size of 500 GB; a different name and size can be used.
7. After creating the new volume, click the Connected Volumes (0) tab, then the 'hamburger menu' to the right, and select the Connect Volumes menu item.
8. The Connect Volumes to Host dialog will open. Click FC-TestVolume (or whatever volume name was created), then click Confirm.
9. Now the new host, Server01, is connected to the new volume, FC-TestVolume, with the host WWNs configured.
10. In the FlashArray Management interface, click the System tab, click Connections, click Host Connections, and select the host that was just configured. Host Port Connectivity should show Redundant connections.
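The same host, WWN, and volume configuration can be performed from the Purity CLI. The commands below are a sketch of that workflow using the example names from this article; the WWN placeholders must be replaced with the actual host WWNs, and the flags should be verified against the CLI reference for the installed Purity version:

```
pureuser@myarray-ct0:~# purehost create --wwnlist <WWN1>,<WWN2> Server01
pureuser@myarray-ct0:~# purevol create --size 500G FC-TestVolume
pureuser@myarray-ct0:~# purevol connect --host Server01 FC-TestVolume
```

After these commands, the host and volume connectivity should match the state produced by the GUI steps above.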
Configuring Volumes with Windows Server
Depending on which version of Windows Server is being configured, refer to the appropriate reference below.
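For orientation, bringing a newly connected FlashArray volume online on Windows Server generally follows the pattern below. This is a hedged PowerShell sketch; the disk number and volume label are examples only, and the version-specific guides referenced above remain the authoritative walkthrough:

```powershell
# Identify the new, uninitialized Pure Storage disk.
Get-Disk | Where-Object { $_.FriendlyName -like "PURE*" -and $_.PartitionStyle -eq "RAW" }

# Assuming the new volume appeared as disk 1 (example only; confirm with Get-Disk),
# initialize it, create a partition with a drive letter, and format it NTFS.
Initialize-Disk -Number 1 -PartitionStyle GPT
New-Partition -DiskNumber 1 -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "FC-TestVolume"
```

The drive letter assigned here is the one to use with the DISKSPD plumbing test described below.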
To test the connectivity from the host to the FlashArray, DISKSPD can be used for a basic plumbing test. DISKSPD is a storage load generator and performance test tool from the Microsoft Windows, Windows Server and Cloud Server Infrastructure Engineering teams.
DISKSPD is not recommended here for performance testing; the use case in this article is simply to verify connectivity to the FlashArray.
Running DISKSPD with the example command line below will generate I/O to evaluate connectivity. The <DRIVE_LETTER> in the command line should be the drive letter of the newly connected volume. To learn how to set up a drive letter for a newly connected volume, see Working with Volumes on a Windows Server 2012, 2012 R2 or 2016 Host.
.\Diskspd.exe -b8K -d3600 -h -L -o16 -t16 -r -w30 -c400M <DRIVE_LETTER>:\io.dat
The plumbing test should generate output similar to the example below.
The host can also be monitored from the Purity CLI, logged in as the pureuser account, with the command below.
pureuser@myarray-ct0:~# purehost monitor --balance
Name      Time                     Initiator WWN  Initiator IQN                                   Target  Target WWN  Failover  I/O Count  I/O Relative to Max
Server01  2017-06-07 09:30:06 PDT  -              iqn.1991-05.com.microsoft:server01 (primary)    -       -                     500187     99%
                                                  iqn.1991-05.com.microsoft:server01 (secondary)  -       -                     506741     100%