I am setting up new R710 servers to upgrade an existing 2008 R2 Hyper-V deployment to 2012 R2 Hyper-V. The current network was inherited and is very basic: all physical, using the default layer-2 switch VLAN (1), no vSwitch in Hyper-V, just a single PowerConnect 5448.
I am having trouble getting VLAN connectivity in the upgrade; I even tested straight from the switch to the Hyper-V vNICs once I enabled VLANs on the vNICs in Hyper-V. My objective is to create a few vNICs on each host (management, backup, live migration) on top of a large team of 6 pNICs, teamed with the native 2012 R2 teaming and no VLANs on the team itself. I created a vSwitch, then created the vNICs with Add-VMNetworkAdapter in PowerShell and assigned tagged VLANs to the management vNICs accordingly, while VM traffic uses the native (default) VLAN 1. On the PowerConnect 5448 we are using LACP for the teams, and I trunked the associated LAG, channel 3. I did not configure anything on the individual ports that make up the LAG, believing the LAG's VLAN config superseded any per-port settings.

Don't pay attention to the DMZ team; that is for the future, once this is working, to allow a few limited VMs to run in the DMZ while still having the ability to live migrate, back up, etc. The DMZ is a physically separate switch, hence the separate team and dedicated pair of pNICs.
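For reference, the host-side setup described above looks roughly like this. This is a sketch, not my exact commands; the team/switch/vNIC names and the VLAN IDs (20 for management, 30 for live migration) are assumptions for illustration:

```powershell
# Sketch of the converged setup (names and VLAN IDs are assumptions).
# 1. LACP team of 6 pNICs, no VLAN config on the team itself.
New-NetLbfoTeam -Name "ConvergedTeam" -TeamMembers NIC1,NIC2,NIC3,NIC4,NIC5,NIC6 `
    -TeamingMode Lacp -LoadBalancingAlgorithm Dynamic

# 2. vSwitch bound to the team.
New-VMSwitch -Name "ConvergedSwitch" -NetAdapterName "ConvergedTeam" -AllowManagementOS $false

# 3. Host vNICs for management, live migration, and backup.
Add-VMNetworkAdapter -ManagementOS -Name "Management"    -SwitchName "ConvergedSwitch"
Add-VMNetworkAdapter -ManagementOS -Name "LiveMigration" -SwitchName "ConvergedSwitch"
Add-VMNetworkAdapter -ManagementOS -Name "Backup"        -SwitchName "ConvergedSwitch"

# 4. Tag the host vNICs; VM traffic stays untagged on the native VLAN 1.
Set-VMNetworkAdapterVlan -ManagementOS -VMNetworkAdapterName "Management"    -Access -VlanId 20
Set-VMNetworkAdapterVlan -ManagementOS -VMNetworkAdapterName "LiveMigration" -Access -VlanId 30
```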
Here is my configuration, please advise.
Host management IP - 192.168.2.9/24
LiveMigration IP - 172.16.16.5/24
Backup IP - TBD (will determine once I figure out this problem)
Upgrade test VM - 192.168.2.131/24 (using untagged native VLAN 1)
Dell PowerConnect switch mgmt interface (VLAN 1) - 192.168.2.254
Upgrade server #2, used for testing, on switch port g29 - 192.168.2.71
CH3 is made up of 6 ports (6, 7, 9, 12, 13, 14); g29 is a test server in the mix. `show vlan` shows both CH3 and g29 carrying VLANs 1 and 20.
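On the switch side, what I intended on the LAG is roughly the following (reconstructed from memory, so the exact PowerConnect 5448 syntax here is an assumption; VLAN 20 is the management VLAN in my setup):

```
! Trunk the LAG, carrying tagged VLAN 20 with VLAN 1 as the untagged native
console(config)# interface port-channel 3
console(config-if)# switchport mode trunk
console(config-if)# switchport trunk allowed vlan add 20
console(config-if)# exit
```

I applied this only to the port-channel, not to the individual member ports (6, 7, 9, 12, 13, 14), on the assumption that the LAG config overrides per-port VLAN settings.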
From the PowerConnect switch I can't ping the host management IP 192.168.2.9, and the 2.9 host has no connectivity at all, in either direction. I know there is some IP overlap, but with VLANs that shouldn't be an issue; I plan on fixing it eventually, but I am remote to the servers and having a hard enough time with this one over the iDRAC. I haven't tried the LiveMigration vNIC yet because I couldn't get management to work. 2.9 worked fine until I applied the VLAN in Hyper-V via PowerShell.
2.9 can't ping the VM at 2.131, which I would expect, since they are on different VLANs.
I believe the configurations are all correct and that my testing is flawed; please enlighten me.
Also, if there is an easier way for me to test this new implementation, please advise.
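So far my testing has just been pings; on the host side the checks I know of are the following (standard NetLbfo/Hyper-V cmdlets; I'd welcome better ideas):

```powershell
# Confirm which VLAN each host vNIC is actually tagged with
Get-VMNetworkAdapterVlan -ManagementOS

# Confirm the team is up, in LACP mode, and all 6 members are active
Get-NetLbfoTeam
Get-NetLbfoTeamMember
```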