Monday, June 1, 2020

Convert to GPT from MBR without losing any data (works on Exchange DAG)

Hi,
Recently my Exchange Server 2016 DAG was running out of space. My database reached 2 TB, and an MBR disk cannot be extended beyond that, so I needed to convert it to GPT. I tried Diskpart, which is a good built-in tool, but the conversion failed with the error below.
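The Diskpart sequence I tried looks like this (disk 1 here is just an example number; use the number of the disk that holds your database):
> Diskpart
>> Select disk 1
>> Convert gpt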

Virtual Disk Service error:
The specified disk is not convertible. CDROMs and DVDs
are examples of disks that are not convertable.

Based on some googling and reading, it seems this command works fine only if I clean the disk first (wiping it), which causes a complete data loss on the disk. That is not an option in my case: I have two Exchange Server 2016 servers with 2 TB of data on one of the disks, and reseeding the data would take several days.

The Fix
I found this super tool:
https://sourceforge.net/projects/gptgen/files/gptgen/v1.1/gptgen-1.1.zip/download
Just download it, place it on drive C: (or any drive other than the one you want to convert), and run gptgen.exe with no arguments. The output will be:

where device_path is the full path to the device file,
e.g. \\.\physicaldrive0.

Available arguments (no "-wm"-style argument combining support):
-b <file>, --backup <file>: write a backup of the original MBR to <file>
-c nnn, --count nnn: build a GPT containing nnn entries (default=128)
-h, --help, --usage: display this help message
-k, --keep-going: don't ask user if a boot partition is found
-m, --keepmbr: keep the existing MBR, don't write a protective MBR
-w, --write: write directly to the disk, not to separate files

One of the required parameters is the disk number, which we can get from Diskpart using the following commands:
> Diskpart
>> List disk


You need to know the number of the disk you want to convert. Let's assume the disk you want to convert is disk 1; remember the number, exit Diskpart, navigate to the gptgen folder, and execute the following command:
gptgen.exe -w \\.\physicaldrive1

The -w parameter writes the GPT directly to the disk and is required for the conversion; just make sure you enter the correct disk number (in my case, it's 1).
The output will be similar to


That easy: GPT without losing any data.
No reboot is needed, though I recommend rebooting and making sure that everything is working fine. Also, before converting, try to minimize I/O on the disk, place the server in maintenance mode, stop the services, and back up your data if you can. Gptgen has an option to back up the current MBR before converting it.
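If you want that MBR backup, a sketch of the backup-plus-convert call looks like this (the backup file path is just an example, and I'm assuming -b and -w can be passed as separate flags, which the help output above suggests):

gptgen.exe -b C:\mbr-disk1-backup.img -w \\.\physicaldrive1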

Enjoy :D



Monday, January 27, 2020

PS script to get a list of shares and perform tests on them


The Problem
In my environment, I have about 200 VMs, plus some physical servers and others in a remote site. One of the critical security issues is open shares, where users can read/write/delete data. Usually these shares are created because of a misconfigured share, or as shares intended to be temporary that suddenly became critical and ended up in production.
Power users who have the ability to create shares and manage permissions may forget to set proper permissions; sometimes they take the easy way of granting Everyone everything on both the NTFS and share permissions, which exposes all the files to all the users, and in some cases these are critical files.
There are several scripts on the Internet that search for shared folders on servers, but the results are usually slow, incompatible, or full of noise such as admin shares. Another challenge is that even after getting the share list, how can the admin confirm whether a user can Access/Read/Write/Delete files in a shared folder without going through the long list of ACLs for both share and NTFS permissions?!

Powershell Script
To address this issue, I created a Powershell script that will do the following
  1. Get a list of all AD computers using the filter operatingsystem -like "*server*" so only servers are retrieved.
  2. Test each computer's reachability and ensure that it is responding.
  3. For each server matching the filter, get the share list using a WMI query. The share list includes only the folder shares and admin shares such as Admin$, C$, IPC$, and so on, excluding printers.
  4. The script can read cluster shares.
  5. If the Impersonation parameter is set to $true, the script will ask for another user's credentials and use them to test Access, Write, Read, and Delete on each share; usually these credentials belong to a limited user (a simplified sketch of this test follows the list below):
    • Create a PSDrive and assign it any available drive letter. Just make sure you have at least one free letter :)
    • If the mapping succeeds, the user can access the share and Access is set to Yes; otherwise, all the other tests stop, as I assume that a user who does not have the right to access a folder will not have write, read, or delete permissions either.
    • If Access succeeds, the script proceeds to write a file named AdcciTestPSWrite.txt to the destination.
    • If the write succeeds, the script proceeds to read it back.
    • The last step is to remove the created file, and the result is updated.
      • The script will not remove random files; it will only try to remove the file it wrote as part of the test.
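To give an idea of how step 5 works, here is a simplified per-share test (this is only a sketch, not the actual script; the share path, server name, and variable names are examples I made up for illustration, while AdcciTestPSWrite.txt is the same test file name the script uses):

$SharePath  = '\\FileServer01\Public'   # example share to test
$Credential = Get-Credential            # limited user to impersonate

# Pick the first free drive letter between D and Z
$Letter = (68..90 | ForEach-Object { [char]$_ }) |
    Where-Object { -not (Get-PSDrive -Name $_ -ErrorAction SilentlyContinue) } |
    Select-Object -First 1

$Result = [pscustomobject]@{ Share = $SharePath; Access = 'No'; Write = 'No'; Read = 'No'; Delete = 'No' }
try {
    # Map the share with the impersonated credentials; a failure here means no Access
    New-PSDrive -Name $Letter -PSProvider FileSystem -Root $SharePath -Credential $Credential -ErrorAction Stop | Out-Null
    $Result.Access = 'Yes'

    # Write a test file, read it back, then delete it; any failure leaves the remaining flags at 'No'
    $TestFile = "${Letter}:\AdcciTestPSWrite.txt"
    Set-Content -Path $TestFile -Value 'test' -ErrorAction Stop
    $Result.Write = 'Yes'
    Get-Content -Path $TestFile -ErrorAction Stop | Out-Null
    $Result.Read = 'Yes'
    Remove-Item -Path $TestFile -ErrorAction Stop
    $Result.Delete = 'Yes'
}
catch { }
finally { Remove-PSDrive -Name $Letter -ErrorAction SilentlyContinue }
$Result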
Limitations:

  1. The script may report that the user has full access (Access, Read, Write, Delete) on folders where Creator Owner has Full Control, such as Folder Redirection or Home Folder shares. I will fix this issue in a coming version, but for now it's good :).
  2. There might be duplicated results when the file server cluster has multiple roles. I will also work on fixing this issue in the next version.
Output
The output is a CSV file that you can format any way you want; it shows the folder name and a table of the permissions the impersonated user has.


Download the script, try it, and let me know.

https://drive.google.com/drive/folders/1NL8YqCI1jBWzhtcWNcMCRihjp2Tn3nJa?usp=sharing


You can reach me at farisnt@gmail.com




Friday, March 15, 2019

Could not find this item

Recently I noticed some weird behavior on my test file server.
Whenever I remove a folder, the folder icon remains there; even though it was deleted, it is still showing.
If I double-click on it, the folder opens and then automatically exits ("Go back one level up").
If I try to remove the folder again, I get the message below.
Using the Sysinternals Procmon.exe, I got the following result when trying to remove the folder: NAME NOT FOUND.
I also noticed, using CMD, that the folder name ends with a white space.
My folder name is VDI1; using CMD with auto-complete I got this:
rd "vdi1 "

FIX:
To remove the folder, use the following command:
rd "\\?\d:\Shares\Home\vdi1 "
Use TAB while typing the path for auto-complete, as the folder name ends with a white space.
Using this method, I was able to remove the folder.
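As a side note, a quick PowerShell way to spot folder names like this is to print each name wrapped in brackets so a trailing space becomes visible (the path below is just my example share; adjust it to yours):

# Names with a trailing space will show up as, for example, [vdi1 ]
Get-ChildItem -Path 'D:\Shares\Home' -Directory | ForEach-Object { "[{0}]" -f $_.Name }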





Friday, March 1, 2019

Get MyDocuments and Desktop folder size for all computers in the network


This script reads the computer list from a CSV file and scans every computer in the list to get the current size of the MyDocuments and Desktop folders on the remote PC.
This script is very useful for sizing your VDI environment and getting an idea of the expected user profile size, especially when all the users are using regular PCs and storing all their content in Desktop and MyDocuments.
I chose to get the information from a CSV file because I requested the most recent list of computers connected to the network from the network team; Active Directory can contain old computer objects or test PCs, which would increase the percentage of failed scans. Anyway, if you prefer to get the list from AD, you still can (the script expects one computer name per line), for example:
Get-ADComputer -Filter * | Select-Object -ExpandProperty Name | Set-Content -Path C:\temp.csv




#############################################################
# This Script is written by Faris Malaeb
# Feel free and contact me farisnt@yahoo.com, farisnt@gmail.com
# http://farisnt.blogspot.com/
#
#This script will scan the network and get the users desktop and mydocument profile size
#This Script will help you in giving you an idea about the users data
# I wrote this script as I needed to know how much users have in their desktop and MyDocument folder
#These data should be moved to the servers as a part of VDI Project implementation.
#
#
# Get the computer objects
# Tested on Windows 10
#
#To use the Script .\Get-DesktopnDocs.ps1 -FullPathForSource C:\PCList.csv -ResultDestination C:\MyFinalResult.csv
#You can use the -Verbose parameter for more information during the runtime
#############################################################


param(
[parameter(Mandatory=$False)]$FullPathForSource="C:\PCs.csv",
[parameter(Mandatory=$False)]$ResultDestination="C:\MyFinalResult.csv"
)
$Collection=Get-Content -Path $FullPathForSource
#The final result will be added to this array
$FullResult=@()

try{
    Write-Host "Validating the destination path $ResultDestination" -ForegroundColor Green
    Add-Content -Path $ResultDestination -Value "temp" -ErrorAction Stop
    Remove-Item $ResultDestination
    }
catch{
    Write-Host "Failed to write to path $ResultDestination... Exiting " $_.exception.message
    break
    }


#Looping through all the computer object
foreach($SinglePC in $Collection){


#Checking if the computer is alive or dead
    try{ (Test-Connection -ComputerName $SinglePC -Count 1 -ErrorAction Stop |Out-Null)

        #I need to have an object which contains ComputerName, Desktop, MyDocument and Username properties
        #I will name this object as TotalResult
        $totalresult =New-Object psobject
        $totalresult | Add-Member -NotePropertyName ComputerName -NotePropertyValue ""
        $totalresult | Add-Member -NotePropertyName Desktop -NotePropertyValue ""
        $totalresult | Add-Member -NotePropertyName MyDocument -NotePropertyValue ""
        $totalresult | Add-Member -NotePropertyName username -NotePropertyValue ""
        $totalresult.ComputerName= $SinglePC

        #Each property of the newly created object will contain the output from a command invocation
        #Getting the username of the active user from the remote machine.
        #I am getting the username by reading the running processes which run under the user context.
        #My domain name is Domain-local, and as the client OS only supports one interactive login, all such processes should belong to the current active user
        #I used the $env:USERDOMAIN system variable to get the domain name; let's assume my domain is named Domain-Local
        #After getting the list of processes running under the username Domain-local\..., I select the first item in the array and remove the domain name
        #This will only return the username, and the result is then stored in the $TotalResult.Username property
        $ScriptToExecute={
        ((Get-Process -IncludeUserName |where {$_.username -like "$env:USERDOMAIN*"})[0]).username.substring(([int]$env:USERDOMAIN.Length+1))
        }
        $totalresult.username=invoke-command -ComputerName $SinglePC -ScriptBlock $ScriptToExecute

        #The command which will be invoked on the remote computer
        $ScriptToExecute= {
        $UserNViaProcess =$args[0] #read the argument that will be passed to the script block through the invoke-command -ArgumentList property
        #The lines below will get the Desktop items which have the extensions below.
        #Measure-Object calculates a value for a group based on a numeric property; in our case the value I want to measure is Length
        #After the calculation I divide the value by 1GB to get the result in GB
        ((Get-ChildItem -Path "C:\users\$UserNViaProcess\Desktop" -Recurse -Include @("*.docx","*.xlsx","*.pptx","*.txt","*.mmp","*.jpg","*.png","*.pdf","*.vsdx")|Measure-Object -Sum Length).Sum /1GB)
        }
        #Executing the remote command and passing the argument list, which is the username
        $totalresult.Desktop=invoke-command -ComputerName $SinglePC -ScriptBlock $ScriptToExecute -ArgumentList $totalresult.username

                                                                           
        $ScriptToExecute={
        $UserNViaProcess =$args[0]
        ((Get-ChildItem -Path "C:\users\$UserNViaProcess\Documents" -Recurse -Include @("*.docx","*.xlsx","*.pptx","*.txt","*.mmp","*.jpg","*.png","*.pdf","*.vsdx") |Measure-Object -Sum Length).Sum /1GB)
        }
        $totalresult.MyDocument=invoke-command -ComputerName $SinglePC -ScriptBlock $ScriptToExecute -ArgumentList $totalresult.username
 
        #Display the result in the console
        $totalresult
        #Adding the Result to the array
        $FullResult+=$totalresult

        }


    Catch {

            Write-Host "I failed on " $SinglePC " Error is " $_.exception.message
         }

    Finally{
    $FullResult | Export-Csv $ResultDestination -NoTypeInformation

        }
}




Tuesday, November 13, 2018

Cloud Witness option is disabled in Failover Cluster

I had my SQL Server cluster AG up and running using a local file share witness, but later on I decided to move the witness to Azure, as I needed to expand my cluster to a DR site.
When I tried to change the cluster quorum setting, the Cloud Witness option was disabled, and there was no indication why: nothing in the Windows event log or the cluster log.
I tried to use PowerShell to change the setting using the following command:

Set-ClusterQuorum -CloudWitness -AccountName <StorageAccountName> -AccessKey <StorageAccountAccessKey>
But I got the following error message

Set-ClusterQuorum : The current cluster functional level does not support cloud witness. Please update your cluster
nodes to Windows Server 2016.
At line:1 char:1
+ Set-ClusterQuorum -CloudWitness -AccountName -AccessKey 

In Windows Server 2016, it is possible to have nodes with different OS versions in the same cluster, such as Windows Server 2012 R2 and 2016; this forces the cluster to operate at a lower functional level for compatibility reasons.
To address this issue we need to raise the cluster functional level using the following command:
Update-ClusterFunctionalLevel
After running this command, pressing Enter, and confirming the change to the functional level, I was able to configure the Cloud Witness via the GUI :)
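Putting it together in PowerShell, a rough sketch of the check-and-fix sequence (the storage account name and key are placeholders):

# Check the current cluster functional level (8 = Windows Server 2012 R2, 9 = Windows Server 2016)
Get-Cluster | Select-Object Name, ClusterFunctionalLevel

# Raise the functional level once every node runs Windows Server 2016 (note: this cannot be rolled back)
Update-ClusterFunctionalLevel

# Then configure the cloud witness, replacing the placeholders with your storage account details
Set-ClusterQuorum -CloudWitness -AccountName "<StorageAccountName>" -AccessKey "<StorageAccountAccessKey>"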

Sunday, June 24, 2018

Powershell: Test RPC Connection and the RPC higher ports

I had an issue in troubleshooting the RPC connection between two networks.
Services that use RPC will connect to the destination server on port 135 and obtain the list of higher random RPC ports.

If run with no parameters, the script connects to localhost; alternatively, use the -Servername parameter to set the computer you want to scan.
You will need to download a tool named PortQry. The script uses this tool for the discovery: after it gets the list of endpoints, it runs Test-NetConnection on each port and returns True if the port was reachable.
NOTE: Please make sure that PortQry.exe is in the following path: "C:\PortQryV2\"
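For example, assuming you saved the script as Test-RPCPorts.ps1 (the script file name and the server name below are just examples):

.\Test-RPCPorts.ps1 -Servername FileServer01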


param(
[string]$Servername="localhost"
)

if (Test-Path "C:\PortQryV2\"){
    Try{
   

        # Query the RPC endpoint mapper on port 135 and keep the unique ncacn_ip (TCP) endpoint lines
        $RPCPorts=  C:\PortQryV2\PortQry.exe -e 135 -n $Servername  | findstr "ncacn_ip" | Select-Object -Unique
            if ($RPCPorts.length -eq 0){
                Write-Host "No output, maybe incorrect server name" -ForegroundColor Red
                return
            }
        ForEach ($SinglePort in $RPCPorts){
        # Extract the port number that appears between the square brackets in each PortQry output line
        $porttocheck=$SinglePort.Substring($SinglePort.IndexOfAny("[")+1)
        $porttocheck=$porttocheck.Remove($porttocheck.Length -1)
        $Result=Test-NetConnection -ComputerName $Servername -Port $porttocheck
        Write-Host "Port health for $Servername on port $porttocheck is " -NoNewline
        Write-Host $Result.TcpTestSucceeded -ForegroundColor Green
        }

    }
    Catch{
        Write-Host $_.Exception.Message -ForegroundColor Red

    }


}
ELSE{
    Write-Host "PortQry is not found"
}


The output should look similar to this