Monday, January 27, 2020

PowerShell script to get a list of shares and test access to them

The Problem
In my environment, I have about 200 VMs, plus some physical servers and others in a remote site. One of the critical security issues is the open share, where users can Read/Write/Delete data. Usually these shares are created due to a misconfigured share, or are intended to be temporary and then suddenly become critical and go into production.
Power users who have the ability to create shares and manage permissions may forget to set proper permissions. Sometimes they follow the easy way of granting Everyone everything in both NTFS and the share, which exposes all the files to all the users; in some cases these are critical files.
There are several scripts on the Internet to search for shared folders on servers, but usually the result is slow, incompatible, or full of noise such as admin shares. Another challenge: even after getting the share list, how can the admin confirm whether a user can Access/Read/Write/Delete files in a shared folder without going through the long list of ACLs for both the share and NTFS?!

Powershell Script
To address this issue, I created a PowerShell script that does the following:
  1. Get a list of all AD computers using the filter operatingsystem -like "*server*", so only servers are retrieved.
  2. Test each computer's reachability and ensure that it's responding.
  3. For each server matching the filter, get the share list using a WMI query. The share list includes only folders and admin shares such as Admin$, C$, IPC$ ..., excluding printers.
  4. The script can read cluster shares.
  5. If the Impersonation parameter is set to $true, the script will prompt for another user's credentials to test Access, Write, Read, and Delete on each share; usually these credentials are for a limited user:
    • Create a PSDrive and assign any available letter to it. Make sure you have at least 1 available letter :)
    • If the map succeeds, the user can Access the share and Access is set to Yes; otherwise all the other tests stop, as I assume that if the user doesn't have the right to access a folder, this user won't have Write, Read, or Delete permissions either.
    • If the Access succeeds, the script proceeds to write a file named AdcciTestPSWrite.txt to the destination.
    • If the Write succeeds, the script proceeds to read it back.
    • The last step is to remove the created file, and the result is updated.
      • The script will not remove random files; it only tries to remove the file it wrote as part of the test.
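The map/write/read/delete sequence in the steps above can be sketched roughly like this. This is a minimal sketch, not the actual script: `$ShareUNC` is a placeholder, and the free-drive-letter lookup is just one possible approach.

```powershell
# Rough sketch of the per-share access test (assumes $ShareUNC and $Cred are set)
$ShareUNC = '\\Server\Share'    # placeholder UNC path
$Cred     = Get-Credential      # the limited test user
# Pick the first free drive letter from D..Z
$Letter = ([char[]](68..90) |
    Where-Object { -not (Get-PSDrive -Name $_ -ErrorAction SilentlyContinue) })[0]
try {
    New-PSDrive -Name $Letter -PSProvider FileSystem -Root $ShareUNC `
        -Credential $Cred -ErrorAction Stop | Out-Null
    $Access = 'Yes'
    $TestFile = "${Letter}:\AdcciTestPSWrite.txt"
    Set-Content -Path $TestFile -Value 'test' -ErrorAction Stop   # Write test
    Get-Content  -Path $TestFile -ErrorAction Stop | Out-Null     # Read test
    Remove-Item  -Path $TestFile -ErrorAction Stop                # Delete only our own file
}
catch { $Access = 'No' }
finally { Remove-PSDrive -Name $Letter -ErrorAction SilentlyContinue }
```

If the `New-PSDrive` call fails, execution jumps straight to the catch block, which matches the rule above: no Access means no further Write/Read/Delete tests.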

Known issues:
  1. The script may report that the user has full access (Access, Read, Write, Delete) in a folder where Creator Owner has Full Control, such as Folder Redirection or Home Folders. I will fix this issue in a coming version, but for now it's good :).
  2. There might be duplicated results when the file server cluster has multiple roles; I will also work on fixing this issue in the next version.
The output is a CSV file showing the folder name and a table of the permissions the impersonated user has; you can format it the way you want.
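Once the CSV is produced, it is easy to filter. For example, to list only the shares the test user could write to (the file name and column names here are illustrative, not the script's exact output schema):

```powershell
# Show only the shares where the Write test returned Yes
Import-Csv -Path .\ShareTestResult.csv |
    Where-Object { $_.Write -eq 'Yes' } |
    Format-Table -AutoSize
```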

Download the script, try it, and let me know.

You can reach me

Friday, March 15, 2019

Could not find this item

Recently I noticed some weird behavior on my test file server.
Whenever I remove a folder, the folder icon remains there; even though it's deleted, it's still showing.
If I double-click on it, the folder opens and then automatically exits ("goes back one level up").
If I try to remove the folder again, I get the message below.
Using Sysinternals Procmon.exe, I got the following message when trying to remove the folder: NAME NOT FOUND.
I also noticed, using CMD, that the folder name ends with a white space.
My folder name is VDI1; using CMD with auto-complete I got this:
rd "vdi1 "

To remove the folder, use the following command:
rd "\\?\d:\Shares\Home\vdi1 "
Use TAB while typing the path for auto-complete, as the end of the folder name contains a white space.
Using this method, I was able to remove the folder.
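To find all folders with this problem in one pass, PowerShell can compare each folder name with its trimmed version and print the ready-to-run rd command. The path below is illustrative; point it at your own share root.

```powershell
# List folders whose names end in white space (they look normal but fail to delete),
# and print the \\?\-prefixed rd command that will remove each one
Get-ChildItem -Path 'D:\Shares\Home' -Directory |
    Where-Object { $_.Name -ne $_.Name.TrimEnd() } |
    ForEach-Object { Write-Host "rd `"\\?\$($_.FullName)`"" }
```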

Friday, March 1, 2019

Get MyDocuments and Desktop folder size for all computers in the network

This script reads the computer list from a CSV file and scans every computer in the list to get the current size of the MyDocuments and Desktop folders on the remote PC.
This script is very useful for sizing your VDI environment and getting an idea about the expected user profile size, especially when all the users are using regular PCs and storing all their content on the Desktop and in MyDocuments.
I chose to get the information from a CSV file because I requested from the network team the most recent list of computers connected to the network; Active Directory can contain old computer objects or test PCs, which would increase the percentage of failed scans. Anyway, if you prefer to get the list from AD, you can still do it by using:
Get-ADComputer -Filter * | Select-Object -ExpandProperty Name | Set-Content -Path c:\temp.csv

# This script is written by Faris Malaeb
# Feel free to contact me
#This script will scan the network and get the users' Desktop and MyDocument profile sizes
#This script will help in giving you an idea about the users' data
# I wrote this script as I needed to know how much data users have in their Desktop and MyDocument folders
#These data should be moved to the servers as part of a VDI project implementation.
# Tested on Windows 10
#To use the script: .\Get-DesktopnDocs.ps1 -FullPathForSoruce C:\PCList.csv -ResultDestination C:\MyFinalResult.csv
#You can use the -Verbose parameter for more information during the runtime

[CmdletBinding()]
param(
    [string]$FullPathForSoruce,
    [string]$ResultDestination
)

$Collection=Get-Content -Path $FullPathForSoruce
#The final result will be added to this array
$FullResult=@()

try{
    Write-Host "Validating the destination path $ResultDestination" -ForegroundColor Green
    Add-Content -Path $ResultDestination -Value "temp" -ErrorAction Stop
    Remove-Item $ResultDestination
}
catch{
    Write-Host "Failed to write to path $ResultDestination... Exiting " $_.exception.message
    return
}

#Looping through all the computer objects
foreach($SinglePC in $Collection){

#Checking if the computer is alive or dead
    try{
        Test-Connection -ComputerName $SinglePC -Count 1 -ErrorAction Stop |Out-Null

        #I need an object which contains ComputerName, Desktop, MyDocument and Username properties
        #I will name this object TotalResult
        $totalresult =New-Object psobject
        $totalresult | Add-Member -NotePropertyName ComputerName -NotePropertyValue ""
        $totalresult | Add-Member -NotePropertyName Desktop -NotePropertyValue ""
        $totalresult | Add-Member -NotePropertyName MyDocument -NotePropertyValue ""
        $totalresult | Add-Member -NotePropertyName username -NotePropertyValue ""
        $totalresult.ComputerName= $SinglePC

        #Each property of the newly created object will contain the output of a command invocation
        #Getting the username of the active user from the remote machine.
        #I am getting the username by reading the running processes that run under the user context.
        #As the client OS only supports one interactive login, all such processes should belong to the current active user
        #I use the $env:USERDOMAIN system variable to get the domain name; let's assume my domain is named Domain-Local
        #After getting the list of processes running under Domain-Local\..., I select the first item in the array and remove the domain name
        #This returns only the username, which is stored in the $TotalResult.username property
        $ScriptToExecute={
            ((Get-Process -IncludeUserName |where {$_.username -like "$env:USERDOMAIN*"})[0]).username.Substring([int]$env:USERDOMAIN.Length+1)
        }
        $totalresult.username=invoke-command -ComputerName $SinglePC -ScriptBlock $ScriptToExecute

        #The command which will be invoked on the remote computer
        #The lines below get the Desktop items which have the extensions below.
        #Measure-Object calculates a value for a group based on a numeric property; in our case the value I want to measure is the Length
        #After the calculation I divide the value by 1GB to get the result in GB
        $ScriptToExecute= {
            $UserNViaProcess =$args[0] #read the argument passed to the script block through invoke-command -ArgumentList
            ((Get-ChildItem -Path "C:\users\$UserNViaProcess\Desktop" -Recurse -Include @("*.docx","*.xlsx","*.pptx","*.txt","*.mmp","*.jpg","*.png","*.pdf","*.vsdx")|Measure-Object -Sum Length).Sum /1GB)
        }
        #Executing the remote invocation and passing the argument list, which is the username
        $totalresult.Desktop=invoke-command -ComputerName $SinglePC -ScriptBlock $ScriptToExecute -ArgumentList $totalresult.username

        #Same calculation for the Documents folder
        $ScriptToExecute= {
            $UserNViaProcess =$args[0]
            ((Get-ChildItem -Path "C:\users\$UserNViaProcess\Documents" -Recurse -Include @("*.docx","*.xlsx","*.pptx","*.txt","*.mmp","*.jpg","*.png","*.pdf","*.vsdx") |Measure-Object -Sum Length).Sum /1GB)
        }
        $totalresult.MyDocument=invoke-command -ComputerName $SinglePC -ScriptBlock $ScriptToExecute -ArgumentList $totalresult.username

        #Display the result in the console
        $totalresult
        #Adding the result to the array
        $FullResult+=$totalresult
    }
    Catch {
        Write-Host "I failed on " $SinglePC " Error is " $_.exception.message
    }
}
$FullResult | Export-Csv $ResultDestination -NoTypeInformation


Tuesday, November 13, 2018

Cloud Witness option is disabled in Failover Cluster

I had my SQL Server AG cluster up and running using a local File Share Witness, but later on I decided to move the witness to the Azure cloud, as I needed to expand my cluster to a DR site.
When I tried to change the cluster quorum setting to Cloud Witness, I got this:
There is no indication why; nothing in the Windows event log or the cluster log.
I tried to use PowerShell to change the setting using the command in the following URL:

Set-ClusterQuorum -CloudWitness -AccountName -AccessKey
But I got the following error message

Set-ClusterQuorum : The current cluster functional level does not support cloud witness. Please update your cluster
nodes to Windows Server 2016.
At line:1 char:1
+ Set-ClusterQuorum -CloudWitness -AccountName -AccessKey 

In Windows 2016, it's possible to have multiple OS versions in the same cluster, like Windows 2012 and 2016; this forces the cluster to operate at a lower functional level for compatibility reasons.
To address this issue we need to raise the cluster functional level using the following command:
Update-ClusterFunctionalLevel and press Enter.
After running this command and confirming the change to the functional level, I was able to configure the cloud witness via the GUI :)
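Before committing, it is worth checking the current level and previewing the change, since raising the functional level is a one-way operation; a short sketch:

```powershell
# Check the current cluster functional level first
Get-Cluster | Select-Object Name, ClusterFunctionalLevel

# Preview, then raise the level (irreversible; all nodes must already run 2016)
Update-ClusterFunctionalLevel -WhatIf
Update-ClusterFunctionalLevel
```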

Sunday, June 24, 2018

Powershell: Test RPC Connection and the RPC higher ports

I had an issue troubleshooting the RPC connection between two networks.
Services that use RPC connect to the destination server on port 135 and obtain the list of higher, random RPC ports.
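Before testing the higher ports, a quick sanity check that the endpoint mapper itself answers can be sketched as follows ('MyServer' is a placeholder name):

```powershell
# Quick check: is the RPC endpoint mapper (TCP 135) reachable at all?
$epm = Test-NetConnection -ComputerName MyServer -Port 135
$epm.TcpTestSucceeded   # True when port 135 answers
```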

The script connects to localhost if used with no parameter, or can be called with the -Servername parameter set to the computer you want to scan.
You will need to download a tool named PortQry. The script uses this tool in the discovery; after the script gets the list of endpoints, it runs Test-NetConnection on each port and returns True if it was reachable.
NOTE: Please make sure that PortQry.exe is in the following path: "C:\PortQryV2\"


param($Servername='localhost')

if (Test-Path "C:\PortQryV2\PortQry.exe"){
    try{
        $RPCPorts=  C:\PortQryV2\PortQry.exe -e 135 -n $Servername  | findstr "ncacn_ip" | Select-Object -Unique
        if ($RPCPorts.Length -eq 0){
            Write-Host "No output, maybe incorrect server name" -ForegroundColor Red
        }
        ForEach ($SinglePort in $RPCPorts){
            #Each line looks like: ncacn_ip_tcp:server[49667] - extract the port between the brackets
            $porttocheck=($SinglePort -split '\[')[1]
            $porttocheck=$porttocheck.Remove($porttocheck.Length -1)
            $Result=Test-NetConnection -ComputerName $Servername -Port $porttocheck
            Write-Host "Port health for $Servername on port $porttocheck is " -NoNewline
            Write-Host $Result.TcpTestSucceeded -ForegroundColor Green
        }
    }
    catch{
        Write-Host $_.Exception.Message -ForegroundColor Red
    }
}
else{
    Write-Host "PortQry is not found"
}

The output should look similar to this

Thursday, July 13, 2017

Get All Windows Services account that are configured to use a domain account in the trusted network

This small script gets all the services that are configured with a RunAs account.
A service can use several possible accounts, like LocalSystem, Network Service... and also a domain account.
The common factor is that a service using a domain account will have its logon account in UPN (FQDN) or SamAccountName (NetBIOS) format.

$Netbios=(Get-ADDomain).NetBIOSName #Get the Domain NetBIOS Name
$fqdn=(Get-ADDomain).dnsroot #Get the Domain FQDN Name

#The WMI Query that will be used
$WMIQuery="select * from Win32_Service where startname like '$Netbios%' or startname like '%$fqdn'"

#Getting the computer list from AD. You can use the filter that fits your criteria; in my case I used a computer name as my filter criteria, but you can also use a search base.
#Then I execute gwmi (Get-WMIObject) on each computer I get from the pipeline
Get-ADComputer -Filter {name -like "*MyServers*"} | foreach {gwmi -ea SilentlyContinue -ComputerName $_.DNSHostName -Query $WMIQuery} |ft -AutoSize SystemName,caption,startname

The result should be something like this.

SystemName   caption                        startname
----------   -------                        ---------
HQ-SRV-N1    SQL Server Reporting Services  Domain\report
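On hosts where DCOM is blocked, the same query can be run over WinRM with the CIM cmdlets instead of gwmi; a sketch, assuming $WMIQuery from above is already defined:

```powershell
# Same Win32_Service query, but over WinRM via Get-CimInstance
Get-ADComputer -Filter {name -like "*MyServers*"} |
    ForEach-Object { Get-CimInstance -ComputerName $_.DNSHostName -Query $WMIQuery -ErrorAction SilentlyContinue } |
    Format-Table -AutoSize SystemName, Caption, StartName
```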

Friday, January 13, 2017

Microsoft Direct Access Failed After Migration, V2V,P2V or changing nic adapter

To make a long story short:
This happens because the network adapter (NIC) GUID in the GPO does not exist on the server.
Backup the DirectAccess server GPO.
Edit the registry.pol (using 3rd-party software).
Import it again to the server.
Enjoy your weekend.

Full story.
You are using MS DA (Microsoft DirectAccess) and users are happy (as well as you, hopefully).
But in some cases you have to perform a V2V or P2V or whatever kind of migration from one server to another, or from one hypervisor to another.
After a successful VM migration, you start with the hypervisor tools install/upgrade.
You add the IP addresses to the NICs and the server seems to be OK.
But DA fails and clients are stuck in the Connecting... state.

So what's going on?
- DA stores its configuration as a GPO linked at the top level of the domain, and uses security filtering to apply the settings to the DA server.
All DA settings are stored in the GPO, including the IP addresses and the GUIDs of the network interfaces used.
As new drivers are presented to the DA server, new network adapters are also presented with new GUIDs, which simply don't match the ones in the GPO.
The server will try to apply the GPO settings, find a mismatch, and fail to apply the policy.
So if you open DA console > Operations Status

Click on DirectAccess and VPN

Click on Remote Access Server
You may find the configuration mismatched with the correct configuration, and you also cannot change any option; it's all grayed out.

Continuing the wizard and trying to apply the policy will fail.

How to Fix
As I mentioned earlier, the network GUIDs stored in the GPO are for the old NICs, which no longer exist.
Open the DA GPO > Settings > Policies > Administrative Templates > Extra Registry Settings,
and write down the following:

This is the GUID for the old NIC, and we need to update these values with the GUIDs for the new NICs.
To get the current GUID assigned to each network interface, use the following PowerShell command:

Get-WmiObject -Query "select * from win32_networkadapter" | select GUID,NetConnectionID | where {$_.GUID -notlike $null} | ft -AutoSize

The returned result is something like:
GUID                                   NetConnectionID
----                                   ---------------
{3DD2D838-35A2-4E05-8A4A-364F8801FCA5} External (The External Interface)
{5D946A6A-63A5-4FFA-951D-A7421EA2068D} Internal (The Internal Interface)

These GUIDs should be updated in the GPO, and as I did not find a way to directly edit (Extra Registry Settings), I backed up the DA GPO and saved it somewhere.
To update these values we need to download and install the following application: Registry Workshop.
Run the application and open the location where you stored the backup: GUID\DomainSysvol\GPO\Machine\Registry.pol
Expand the tree to the following directory:

Double-click on InternalInterface and replace it with the new GUID you got from the PowerShell command, and do the same for the InternetInterface; save the file and close the application.
Make sure that the replication has finished.
Run gpupdate /force /target:computer
Open again Remote Access Management (DA console) > DirectAccess and VPN > Remote Access Server and you will see the configuration is now correct.
Click Next and update with the correct values.

Happy working DirectAccess!
You may get a DNS warning; simply remove the DNS servers from the Infrastructure Server setup and re-add them.