How to round and pad in the same function?

This topic contains 20 replies, has 4 voices, and was last updated by Ron 2 weeks, 3 days ago.

  • #58235
    MNGuy33
    Participant

    I have a script that converts a CSV file to an ASC formatted file with the data in specific column positions.

    I am trying to take a Sales value in my array that has far too many trailing decimal places, round it to 2 decimal places, and then pad it as well. The Sales value in the array is formatted like "354.9999999999" or "21.98" or "967.000001", etc.; it's not consistent. There are no commas in it.

    Below is the existing code, which currently just pads the Sales field. I need it to first round the Sales number to 2 decimal places and then pad it.

        $array = Import-csv -path $FileName.FullName  -delimiter ',' -Header RecordType,Date,Time,Code,Sales,Customers,Items
        $arr = $array | foreach-object {$_.RecordType + ' ' + $_.Date + ' ' + $_.Time + '    ' + $_.Code.padleft(5,' ') + '   ' +$_.Sales.padleft(10,' ') + $_.Customers.padleft(5,' ') + $_.Items.padleft(5,' ') }
  • #58238
    Rob Simmers
    Participant

    I assume you're looking at something like this:

    $values = "354.9999999999","21.98","967.000001"
    foreach ($value in $values) {
        ([math]::Round([double]$value, 2)).ToString('0.00').PadLeft(10)
    }
    

    If that is the case, there is an easier way than looping, using calculated properties:

    $array = Import-csv -path $FileName.FullName  -delimiter ',' -Header RecordType,Date,Time,Code,Sales,Customers,Items
    $array | Select RecordType,
                    Date,
                    Time, 
                    @{Name="Code";Expression={$_.Code.PadLeft(5)}},
                    @{Name="Sales";Expression={([math]::Round([double]$_.Sales, 2)).ToString('0.00').PadLeft(10)}},
                    @{Name="Customers";Expression={$_.Customers.PadLeft(5)}},
                    @{Name="Items";Expression={$_.Items.PadLeft(5)}}
    
  • #58241
    Olaf Soyk
    Participant
  • #58250
    MNGuy33
    Participant

    Hello to you both; unfortunately, neither of these solutions worked. Let me clarify a bit more what I am trying to do.

    I have a CSV file that is being converted to ASC format by this PowerShell script. The CSV file contains the following columns and formats:

    RecordType = the letter "D"
    Date = YYYYMMDD
    Time = HHMM in military time format no colon
    Code = 5 digits of text
    Sales = number with decimals sometimes, sometimes not. Examples of values include (4.0099999999999998, 31.02, 85, 3.5800000000000001)
    Customers = number with no decimals
    Items = number with no decimals

    Sample of the CSV:

    Record Type,Date,Time,Code,Sales,Customers,Items
    D,20151106,0215,MERCH,4.0099999999999998,1,3
    D,20151106,0515,FDSVC,1.1899999999999999,1,1
    D,20151106,0530,FDSVC,1.5900000000000001,1,1
    D,20151106,0600,FDSVC,0.98999999999999999,1,1
    D,20151106,0615,FDSVC,21.18,8,13
    D,20151106,0615,MERCH,22.02,5,6
    D,20151106,0715,MERCH,21.489999999999998,6,11

    The Sales column is what is causing me so much trouble. I need the trailing decimals gone; it needs to be only 2 decimal places.

    The output, once in ASC format, needs to be in a very specific column layout, hence the padding in my script.

    A sample ASC output format should look like this:

    D 20151107 0630    FDSVC        10.76    1    4
    D 20151107 0945    FDSVC         5.46    3    4

    I have never coded in PowerShell before; I am simply trying to update the existing script I have so that the Sales number is rounded AND padded left.

    Your help is very much appreciated!

  • #58253
    Olaf Soyk
    Participant

    First: You should have more faith in us! 😉 😀
    Second: You really should start to learn PowerShell. I am sure it will pay off for you in the future. 😉

    $Array = Import-Csv -Path '-place your path here-\sample.csv' -Delimiter ','
    $Array | 
    Foreach-Object -Process { "{0,1} {1,7} {2,4} {3,8} {4,12:n2} {5,4} {6,4}" -f $_.'Record Type',$_.Date,$_.Time,$_.Code,[Math]::Round($_.Sales,2),$_.Customers,$_.Items} | 
    Out-File -FilePath -place your path here-\asc.asc -Encoding utf8 -Append -Force
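
    The `{4,12:n2}` part of the format string does both jobs at once: `,12` right-aligns the value in a 12-character field, and `:n2` formats it to 2 decimal places. A quick sketch with sample Sales values from this thread:

    ```powershell
    # Sample Sales values as they come out of Import-Csv (strings)
    $values = '4.0099999999999998', '21.18', '0.98999999999999999'
    foreach ($v in $values) {
        # Cast to [double] so the numeric format specifier applies
        '{0,12:n2}' -f [double]$v
    }
    ```

    Note that format specifiers are ignored for plain strings, which is why `[Math]::Round` (or a `[double]` cast) is needed before formatting.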
    
  • #58280
    Rob Simmers
    Participant

    Much easier to help with examples and expected output. 🙂

    • #58282
      MNGuy33
      Participant

      Thanks for the help; unfortunately, it's still not working. Here is what it is outputting:

      D 20151106 0215    MERCH   4.0099999999999998  
      D 20151106 0515    FDSVC   1.1899999999999999  
      D 20151106 0530    FDSVC   1.5900000000000001  
      D 20151106 0600    FDSVC   0.98999999999999999 
      D 20151106 0615    FDSVC        21.18    8   13
      D 20151106 0615    MERCH        22.02    5    6

      Also, if it helps, I've pasted the entire script below. It does a lot of other things, so I initially didn't paste it all so as not to overload you with info.

      if (Get-Variable RootDir -Scope Global -ErrorAction SilentlyContinue)
      {
        $FromMainEpos = 'Y'
      }
      else
      {
        write-host 'Reading Config file'
        $MyRunPath=Split-Path $script:MyInvocation.MyCommand.Path
        $scriptPath = "$MyRunPath\ReadConfig.ps1"
        Invoke-Expression "$scriptPath"
        $sftplocation     = "$RootDir\Scripts\temp\tempholding"  
        $EPOSdir 		    = "$RootDir\Data"
        $logdir           = "$RootDir\Scripts\logs"
        $logfile          = "$logdir\log-overall.txt"
      }
      
      #------------------------------------------------------------------
      # General Variables
      #------------------------------------------------------------------
          $date 			= get-date
          $scriptname 	= $MyInvocation.MyCommand.Name
          $servername		= get-content env:computername
      
          ## Splunk Logging variables
          # $MyAccount - taken from the Configuration file
          # Set the datestamp just before each log message
          $MySystem='BAT'
          $MyModule='Auto Down Load'
      	$masterlog = "$RootDir\logs\eposlog.txt"
      
      #------------------------------------------------------------------
      # return a result of 0 to indicate all extra process was successful 
      # return a result of 1 to force the whole process to exit
      $result = 0
      #------------------------------------------------------------------
      
      $SourcePath = "$RootDir\Scripts\temp\tempholding"
      $TargetPath = "$RootDir\Scripts\Temp\PointofSales" 
      
      
      $MyDateTimeStamp = Get-Date -f "yyyy-MM-dd HH:mm:ss,fff"
      $MyMessage='SFTP CSV to ASC conversion Starting'
      $MyPriority='INFO'
      $LogMessage = $MyDateTimeStamp + ' | ' + $MyPriority.padright(6).substring(0,6) + ' | ' + $MyAccount.padright(10).substring(0,10) + ' | ' + $MySystem.padright(3).substring(0,3) + ' | ' + $MyModule.padright(20).substring(0,20) + ' | ' + $MyMessage
      Add-Content $logfile  $LogMessage
      
      
      $Prefix = ""
      $FilterVal   = $Prefix + '*.csv'
      $CSVFiles = Get-ChildItem -Path $SourcePath -Filter $FilterVal | sort -unique
      ForEach ($FileName in $CSVFiles)
      {
        if ($FileName.FullName  -ne $null)
        {
          $array = Import-csv -path $FileName.FullName  -delimiter ',' -Header RecordType,Date,Time,Code,Sales,Customers,Items
      	$arr = $array | Foreach-Object -Process { "{0,1} {1,7} {2,4} {3,8} {4,12:n2} {5,4} {6,4}" -f $_.'Record Type',$_.Date,$_.Time,$_.Code,[Math]::Round($_.Sales,2),$_.Customers,$_.Items}
          # $arr = $array | foreach-object {$_.RecordType + ' ' + $_.Date + ' ' + $_.Time + '    ' + $_.Code.padleft(5,' ') + '   ' + [math]::Round($_.Sales,2).PadLeft(10,' ') + $_.Customers.padleft(5,' ') + $_.Items.padleft(5,' ') }
      	$BaseName = $FileName -replace ".csv", ""
          $OutputFile = $TargetPath + '\' + $BaseName + '.asc'
          # $MyDateTimeStamp = Get-Date -f "yyyy-MM-dd HH:mm:ss,fff"
          # $MyMessage='Input File = ' $FileName.FullName    'Output File = '  $OutputFile
          # $MyPriority='INFO'
          # $LogMessage = $MyDateTimeStamp + ' | ' + $MyPriority.padright(6).substring(0,6) + ' | ' + $MyAccount.padright(10).substring(0,10) + ' | ' + $MySystem.padright(3).substring(0,3) + ' | ' + $MyModule.padright(20).substring(0,20) + ' | ' + $MyMessage
          # Add-Content $logfile  $LogMessage	
      	$SkipLine='Y'
          foreach ($i in $arr)
      	{  
            $Record = $i.substring(0,47)
      	  If ($SkipLine -eq 'N')
      	  {
              Add-Content $OutputFile $Record
            }
      	  $SkipLine='N'	  
      	}
        }
      }
      
      $MyDateTimeStamp = Get-Date -f "yyyy-MM-dd HH:mm:ss,fff"
      $MyMessage='SFTP CSV to ASC conversion complete'
      $MyPriority='INFO'
      $LogMessage = $MyDateTimeStamp + ' | ' + $MyPriority.padright(6).substring(0,6) + ' | ' + $MyAccount.padright(10).substring(0,10) + ' | ' + $MySystem.padright(3).substring(0,3) + ' | ' + $MyModule.padright(20).substring(0,20) + ' | ' + $MyMessage
      Add-Content $logfile  $LogMessage
      
      
      #------------------------------------------------------------------
      
      return $result
    • #58285
      Olaf Soyk
      Participant

      In my experience, when debugging a large script, it usually helps to split it into smaller but still functional chunks. Try running just the part you asked about here as a separate script and see if it works. If not, you might have some inconsistencies in your CSV file.

      I see that you import your CSV and supply extra headers. Doesn't your CSV already have some? Also check that the header names match ('RecordType' vs. 'Record Type').

    • #58288
      MNGuy33
      Participant

      Getting closer!

      It converted properly once I fixed the "Record Type" vs "RecordType" header issue.

      But the first row of data in the file is getting dropped. Powershell is throwing this error...

      Cannot convert argument "0", with value: "Sales", for "Round" to type "System.Double": "Cannot convert value "Sales" to
      type "System.Double". Error: "Input string was not in a correct format.""
      At .ps1:63 char:148
      + $arr = $array | Foreach-Object -Process { "{0,1} {1,7} {2,4} {3,8} {4,12:n2} {5,4} {6,4}" -f $_.'RecordType',$_.Date,$_.Time,$_.Code,[Math]::Round <<<< ($_.Sales,2),$_.Customers,$_.Items}
          + CategoryInfo          : NotSpecified: (:) [], MethodException
          + FullyQualifiedErrorId : MethodArgumentConversionInvalidCastArgument

    • #58294
      Ron
      Participant

      Your input file already has headers and you are creating new ones.

      -Header
          Specifies an alternate column header row for the imported file. The column header determines the names of the properties of the object that Import-CSV creates.

          Enter a comma-separated list of the column headers. Enclose each item in quotation marks (single or double). Do not enclose the header string in quotation marks. If you enter fewer column headers than there are columns, the remaining columns will have no header. If you enter more headers than there are columns, the extra headers are ignored.

          When using the Header parameter, delete the original header row from the CSV file. Otherwise, Import-CSV creates an extra object from the items in the header row.

          Required?                    false
          Position?                    named
          Default value
          Accept pipeline input?       false
          Accept wildcard characters?  false
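
      A quick sketch of the difference, assuming a sample.csv like the one posted earlier in this thread:

      ```powershell
      # Without -Header, Import-Csv takes property names from the file's first row:
      $rows = Import-Csv -Path '.\sample.csv'
      $rows[0].'Record Type'    # first data row

      # With -Header, the file's header row is imported as a data row, so skip it:
      $rows = Import-Csv -Path '.\sample.csv' -Header RecordType,Date,Time,Code,Sales,Customers,Items
      ($rows | Select-Object -Skip 1)[0].RecordType    # first real data row again
      ```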
    • #58300
      Olaf Soyk
      Participant

      If your CSV file already has headers, you don't have to add them while you import it; otherwise it is necessary. The CSV sample you posted includes a header row, and in your script you added headers 'again' ... that's what I meant.

    • #58306
      MNGuy33
      Participant

      The CSV being imported will always have these headers in the input file. How do I exclude them then in my code?

    • #58309
      Ron
      Participant

      There's no reason to rename them unless you suspect they might change (but not the data layout). Just remove the -Header parameter and go back to the original name, 'Record Type'. If, for some reason, you felt you had to rename them, then just skip the first record:

      $array[1..($array.count - 1)] | ...
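
      That range expression slices the array from the second element (index 1) through the last, dropping the imported header row. For example:

      ```powershell
      $array = 'header', 'row1', 'row2', 'row3'
      $array[1..($array.Count - 1)]    # row1, row2, row3
      ```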

    • #58315
      Ron
      Participant

      You can also get rid of the skip-line logic in your output. The headers won't be in your output array.

    • #58316
      MNGuy33
      Participant

      I am not sure what you mean by this: $array[1..($array.count - 1)] | ...

      I'm a total newbie here. 🙂

      Below is the code as it stands, based on your recommendation to remove the -Header info. The script runs without error now; however, the first row of actual data from the CSV input file is missing from the output ASC file.

      **********EDIT: I changed the skip line parameter to = 'N' and all is working now!!!*****************

      $array = Import-csv -path $FileName.FullName  -delimiter ',' 
      	$arr = $array | Foreach-Object -Process { "{0,1} {1,7} {2,4} {3,8} {4,12:n2} {5,4} {6,4}" -f $_.'Record Type',$_.Date,$_.Time,$_.Code,[Math]::Round($_.Sales,2),$_.Customers,$_.Items}

      Input data:

      Record Type,Date,Time,Code,Sales,Customers,Items
      D,20151106,0215,MERCH,4.0099999999999998,1,3
      D,20151106,0515,FDSVC,1.1899999999999999,1,1
      D,20151106,0530,FDSVC,1.5900000000000001,1,1

      Output ASC file:

      D 20151106 0515    FDSVC         1.19    1    1
      D 20151106 0530    FDSVC         1.59    1    1
    • #58312
      Olaf Soyk
      Participant

      Why don't you play around a little in the console until you understand exactly what's going on? In the console you get an immediate response and can see how it works.

      Get-Help Import-CSV -Full

      Try to compare this:

      Import-CSV -Path 'Your file path here' 

      with this:

      Import-CSV -Path 'Your file path here' -Header 'Your headers here'

      ... you know what I mean?

      BTW: You can debug your script line by line in the ISE.

    • #58324
      Ron
      Participant

      Sometimes there's no getting around reading the help files. 🙂

      See: help about_array

      What I did was index the array from the 2nd element (arrays are zero-based) to the last element.

      But, all of that is unnecessary unless you need to mess with the headers, which seems unlikely. When you use the existing headers, they won't be in the input array or the output array.

  • #58291
    Ron
    Participant

    I didn't dig too deep into your script, but I don't see where you clear your output file if it already exists. If you run this multiple times, it will keep accumulating records, which seems to be the case, since some records are in the old format.

    if (test-path $OutputFile) {clear-content $OutputFile}

    • #58297
      MNGuy33
      Participant

      The input file name is always a unique file name so I don't need to worry about clearing the output file.

  • #58318
    Olaf Soyk
    Participant

    I'm a total newbie here. 🙂

    Maybe you should take a little time and start to learn PowerShell from scratch, so you understand better how to debug and how to get the help and information you need. Here you can find some great resources to start from: Beginner Sites and Tutorials

  • #58321
    MNGuy33
    Participant

    Thank you ALL so much for your help on this! You have no idea how much I appreciate this!
