Approach ideas?

This topic contains 1 reply, has 2 voices, and was last updated 3 months ago.

  • #162018

    Participant

    Looking for ideas on an approach for this.

    - A directory contains ~370 XML files, each in its own subdirectory.
    - I need to run a script against all of the files, parse elements from each, and append the data to a single CSV.


    My thought is to use an identifier from each XML file to start a new line in the CSV (they have unique ID numbers), beginning with the first item, then append the rest of the data from that report to the same line. Then, using a ForEach loop, parse through the rest of the $items in $directory, writing each value to a specific $column depending on the XML element.
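    Something like this is what I have in mind (sketch only; <Report>, <Id>, and <Status> are placeholder element names standing in for the real schema):

        # Sketch only - the root <Report> element and its children are placeholders
        $directory = 'C:\Reports'
        $outFile   = 'C:\Reports\summary.csv'

        Get-ChildItem -Path $directory -Filter *.xml -Recurse | ForEach-Object {
            [xml]$doc = Get-Content -Path $_.FullName -Raw
            # One row per file: the unique ID starts the line and the rest of
            # the report's data lands in its own column
            [PSCustomObject]@{
                Id     = $doc.Report.Id
                Status = $doc.Report.Status
            }
        } | Export-Csv -Path $outFile -NoTypeInformation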


    Does this sound like the best approach? Can ForEach do this?
    Opinions welcomed! Thanks in advance.

  • #162068

    Participant

    Sure, it's possible. The "best" approach is a rather subjective thing. All that matters is that you have something that works for your use case and meets the KPIs you need.

    Any time you are looking at a lot of files (or large files), you need to accept that it will take as long as it takes. There are performance tweaks for just about anything, but based on your description, a simple scan that extracts a small piece of data should not be that big of a hit, depending on how large those XML files are and how much of each has to be scanned to find the bits you are after.

    Grabbing the content is just a matter of Select-String or regex matches, and appending the results to a CSV.
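    For example, something along these lines (a rough sketch; the directory, the <Id> regex, and the output path are all placeholders):

        # Rough sketch - paths and the <Id> pattern are placeholders
        Get-ChildItem -Path 'C:\Reports' -Filter *.xml -Recurse |
            Select-String -Pattern '<Id>(?<id>[^<]+)</Id>' |
            ForEach-Object {
                # One row per match: the file it came from plus the captured value
                [PSCustomObject]@{
                    File = $_.Path
                    Id   = $_.Matches[0].Groups['id'].Value
                }
            } |
            Export-Csv -Path 'C:\Reports\ids.csv' -NoTypeInformation -Append

    Export-Csv -Append adds rows to an existing file (and creates it if it doesn't exist yet), so you don't have to build the CSV lines by hand.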


The topic ‘Approach ideas?’ is closed to new replies.