For each($file in $files) iterate twice (not the same as the other post)

Hi again,

I'm running code that gets the files from a folder and does something with each one, but the foreach loop over the files runs twice.

I tried on other computers, with new files, and in other folders, and it still loops twice. I added a watch on $files, and its element count is 28.

In the example code, with 28 files in the folder, $c ends up as 56.

The full code:

    Function get-folder()
    # Function to open a dialog to select a folder
    {
        [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
        $foldername = New-Object System.Windows.Forms.FolderBrowserDialog
        $foldername.Description = "Select a folder"
        $foldername.RootFolder = "MyComputer"
        if($foldername.ShowDialog() -eq "OK")
        {
            $folder += $foldername.SelectedPath
        }
        return $folder
    }

    Function get-files($initialdirectory, $match_string)
    # Function that takes the root folder and a string to filter the file names, returns a list
    {
        $files = Get-ChildItem -Recurse $initialdirectory | Where-Object { ! $_.PSIsContainer } | Select-Object Name, FullName
        $match_list = @()
        Write-Output $files
        foreach($file in $files){
            if($file.Name -match $match_string){
                $match_list += $file
            }
        }
        return $match_list
    }

    Function get-dates($file, $regex)
    # Function to get a date from a file with a given format
    {
        $date = Get-Content $file.FullName -First 1
        $null = $date -match $regex
        $date = $Matches[0]
        return $date
    }

    Function write-file($filetxt, $split_num, $output_file, $machine){
        Add-Content -Path $output_file -Value $content
        foreach($line in Get-Content $filetxt.FullName){
            # replace the dots with commas
            $line = $line -replace '.', ","
            # format it to get semicolon-separated values
            $line = $line -replace '\s\s+', ";"
            # split the line and take only the desired column
            $content = $line.Split(";")[$split_num]
            # add the date and the machine to the line
            $content = $date+";"+$machine+";"+$content
            # add to the file
            Add-Content -Path $output_file -Value $content
        }
    }

    # Get the root folder of the machines
    $root_folder = get-folder
    $sw = [Diagnostics.Stopwatch]::StartNew()

    # create the final folder
    $final_folder = $root_folder+"\FINAL"
    if(-not(Test-Path $final_folder)){mkdir $final_folder}

    # iterate over the root_folder subfolders
    $folders = Get-ChildItem $root_folder -Directory
    foreach($folder in $folders){
        $files = get-files $folder.FullName "sa"
        if(($files | Measure-Object).Count -gt 0){
            $c = 0
            foreach($file in $files){
                $c++
            }
            Write-Output $c
        }
    }

    $sw.Stop()
    Write-Output "Total time: "$sw.Elapsed " ms"
    Write-Output $c