My PS script doesn't throw any errors, nor does it load data into a flat file


  • Author
    Posts
    • #272713
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Hi all, I have a PS script that works fine on SERVER A. As we are migrating, I moved the script to SERVER B; when I execute it there, I do not get any error message, and the flat file doesn't load any records.

      The Task Scheduler uses this action: F:\PVD_GIS_Extracts\PowerShell_Scripts\DataExtract-Parcel.cmd

      DataExtract-Parcel.cmd contains:

      powershell "& ""F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"""

      Please advise.

      Thanks

    • #272722
      Senior Moderator
      Topics: 10
      Replies: 196
      Points: 1,051
      Helping Hand
      Rank: Community Hero

      What flat file? What records?

      Why are you using a .cmd file to run a PowerShell script, rather than running the script directly?

      What are the differences between server A and B? Are they the same OS? Are the user accounts migrated, or re-created from scratch? Is this a local AD domain, or Azure, or some other cloud-hosted infrastructure? Are the shared drives/file-system paths the same? Are the group policies the same?

      Have you reviewed the script for things that will be affected by configuration differences between server A and B? Have you reviewed your ACLs?

    • #272728
      Participant
      Topics: 5
      Replies: 177
      Points: 686
      Helping Hand
      Rank: Major Contributor

      I’d say you don’t really need the intermediate .cmd file.  You can just create a scheduled task to call powershell with the argument of F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 Having said that, it doesn’t explain why it worked on one machine and not the other.  To troubleshoot, log into SERVER B and make sure F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 is a correct path.  If so, just try to run it manually.  If it works, then the problem is with your scheduled task and not the script (which you didn’t post).  If it still doesn’t work, then the issue is probably with the script itself.  If you post it, I may be able to help you troubleshoot it.
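
      As a sketch, the wrapper-free scheduled task could be created with the ScheduledTasks cmdlets available on Server 2012 and later (the task name and trigger time below are illustrative):

```powershell
# Call powershell.exe directly from the task; no intermediate .cmd needed.
# Task name and trigger time are illustrative.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'ParcelExtract' -Action $action -Trigger $trigger
```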

    • #272731
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      What are the differences between server A and B? Are they the same OS?

      Thanks, and my apologies: Server A is 2008 and Server B is 2016. We have been having this issue, and together with the server team we went through everything; the two servers are otherwise identical. Also, this PS script is one of 5 scripts that were migrated over. The other 4 scripts have no issue, only this one.

      Thanks

    • #272737
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      To troubleshoot, log into SERVER B and make sure F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 is a correct path. If so, just try to run it manually.

      Thanks. I tried from the Task Scheduler and manually, with the same results. I will post the script here.

      Thanks

    • #272758
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member


      Thanks MikeR, I set the Task Scheduler action to F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 and it's taking forever; usually it takes 3 minutes.

    • #272959
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member
    • #272989
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Mark,
      I posted the script here; please advise where it is wrong.
      Thanks

    • #273034
      Participant
      Topics: 17
      Replies: 1869
      Points: 3,631
      Helping Hand
      Rank: Community Hero

      Are all of these PVD cmdlets an internally developed API wrapper? Nothing posted is necessarily wrong, and per your post you are not getting errors; it's just taking a long time to execute. First, what is "a long time"? Does it finish? If you execute this manually on the server, does it complete?

      One of the things that stands out to me is Out-PVDQueryToFile. Why do you need a wrapper for this? Normally, you establish a session and perform a GET, in your case a SQL query. The query result should come back as a PSObject, so you can use PowerShell's own export methods: New-Session | Get-Something | Export-Csv -Delimiter '|'.

      I would run this in pieces to determine where the time is being consumed: create the session, run the query, then parse to a file.
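
      The run-it-in-pieces suggestion can be sketched with Measure-Command; the session/query cmdlet names below are generic stand-ins from the pipeline above, not real commands:

```powershell
# Time each stage separately to see where the minutes go.
# New-Session / Get-Something are placeholders -- substitute the PVD wrapper's actual cmdlets.
Measure-Command { $session = New-Session }                    # 1. create the session
Measure-Command { $rows = Get-Something -Session $session }   # 2. run the SQL query
Measure-Command {
    $rows | Export-Csv -Path 'F:\PVD_GIS_Extracts\parcels.csv' -Delimiter '|' -NoTypeInformation
}                                                             # 3. write the file
```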

    • #273052
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Thanks Rob Simmers. I think I am confusing everyone here. As I mentioned, there are two servers, and there are 5 .ps1 scripts on the old server. All 5 scripts were copied to the new server. Of the 5, the other 4 are working; only this one (the posted script) is not. Here are the cases on the new server for the problematic script:

      Case 1: The script is executed via the Task Scheduler, which uses this action: F:\PVD_GIS_Extracts\PowerShell_Scripts\DataExtract-Parcel.cmd

      DataExtract-Parcel.cmd contains:

      powershell "& ""F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"""

      It completes in 2 minutes and NO DATA is loaded.

      Case 2: When I just run F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 via the Task Scheduler, it runs for hours and won't stop until I manually stop it. On the OLD SERVER this takes 3 minutes and loads the data, 350,000 records.

      Case 3: I manually executed the script, and it completes in 3 minutes with NO RECORDS loaded.

      Thanks

    • #273055
      Participant
      Topics: 17
      Replies: 1869
      Points: 3,631
      Helping Hand
      Rank: Community Hero

      When I just run F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1 via the Task Scheduler, it runs for hours and won't stop until I manually stop it.

      You are running the script directly as the Program\Script? That would just attempt to open the .ps1, not execute it. Just like the batch file, you need to call powershell.exe and pass the script as the parameter:

      The Program\Script would be C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe and -File … would be the Argument.
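
      In the task's Edit Action dialog that looks like the following (the script path is taken from the .cmd earlier in the thread; -NoProfile and -ExecutionPolicy Bypass are optional switches commonly added for scheduled runs):

```
Program/script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments:  -NoProfile -ExecutionPolicy Bypass -File "F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1"
```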

    • #273058
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Thanks Rob Simmers; apologies for my ignorance regarding PS.
      So should I just open a new window in PowerShell ISE, paste C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File F:\PVD_GIS_Extracts\PowerShell_Scripts\ParcelExtract_CLOUD-Email.ps1, and run it?

    • #273061
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Thanks. In the Task Scheduler I created a new task.
      Now it runs for a few seconds and loads nothing.

    • #273433
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Hi, any update on this?
      Thanks

    • #273466
      Participant
      Topics: 17
      Replies: 1869
      Points: 3,631
      Helping Hand
      Rank: Community Hero

      You are not providing a great deal of information to help us assist you. Assuming you have the task set up to execute the script properly, the next step might be validating that the script is actually executing. Add a line to the beginning of the script:

      If you run the task and see the file, then the script is executing. If it is executing, then you need to look at what is incorrect in the script. What credentials are running the script? Are all of the settings and credentials in the script correct?
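
      The line added at the top of the script could be a simple file-touch, for example (the log path here is illustrative):

```powershell
# Proves the task actually launched the script; the path is illustrative
"Script started at $(Get-Date)" | Out-File -FilePath 'C:\Temp\ParcelExtract_started.txt' -Append
```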

    • #273508
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Thanks to all. Here is what I did: in my SELECT query, instead of bringing in all 350K records, I added TOP 100, and I was able to load the data into the flat file. Now my question is: is it some kind of timeout issue? If it is, where should I look to increase the timeout?
      Thanks

    • #273571
      Participant
      Topics: 17
      Replies: 1869
      Points: 3,631
      Helping Hand
      Rank: Community Hero
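
      If the script's query layer is System.Data.SqlClient, the timeout is typically raised on the command object. A sketch (the connection string and query are placeholders, and the PVD wrapper may expose its own timeout parameter instead):

```powershell
# Sketch: raising the SQL command timeout (default is 30 seconds)
# Connection string and query are placeholders
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=myServer;Database=myDb;Integrated Security=True'
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT ... FROM Parcels'   # placeholder query
$cmd.CommandTimeout = 600                      # seconds; 0 means no limit
```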
    • #275280
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Thanks all,
      After many iterations, what I found was that in my SELECT statement, if I choose TOP 40000 it loads data; anything more than that produces no errors, but the file doesn't get loaded. I tried the timeout options provided by Rob Simmers, but the issue remains the same: only 40K records get loaded into the flat file.

    • #275463
      Senior Moderator
      Topics: 10
      Replies: 196
      Points: 1,051
      Helping Hand
      Rank: Community Hero

      You’ve mentioned that this one script out of a set of five is experiencing an issue, but the others seem fine. What does this script do that is different from the other four? Do the others handle the same records and process them into output files?

      If I am understanding correctly from looking at your script, it connects to a database stored on a cloud server (Oracle VPD?) and submits an SQL query for ~80 fields of the records stored in the database (real estate data of some kind). Are the other scripts providing the custom PVD commands that are being used in this script? Please answer Rob’s question from before:

      Are all of these PVD cmdlets an internally developed API wrapper?

      A query for ~80 fields for 40000 records would produce a results table with more than 3 million values. Is this normal for how you’ve handled this data in the past? I’m not a database expert, but it seems likely that you would run into memory issues at some point, or connection limitations on Oracle’s side. Have these records been stored in the cloud for a long time, or were they migrated recently?

      Frankly, due to the amount of data involved, this looks like a job for an actual application and not a script.
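
      One way a script can sidestep the memory concern is to stream rows straight to the flat file rather than buffering the whole result set; a sketch, assuming a System.Data.SqlClient connection (connection string, query, and output path are placeholders):

```powershell
# Sketch: stream rows to the flat file instead of holding 350K records in memory.
# Connection string, query, and output path are placeholders.
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=myServer;Database=myDb;Integrated Security=True'
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT * FROM Parcels'     # placeholder query
$cmd.CommandTimeout = 0                        # no timeout for the long-running query
$reader = $cmd.ExecuteReader()
$writer = [System.IO.StreamWriter]::new('F:\PVD_GIS_Extracts\parcels.txt')
while ($reader.Read()) {
    $fields = for ($i = 0; $i -lt $reader.FieldCount; $i++) { $reader.GetValue($i) }
    $writer.WriteLine(($fields -join '|'))     # pipe-delimited, matching the '|' delimiter used earlier
}
$writer.Close(); $reader.Close(); $conn.Close()
```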

    • #275766
      Participant
      Topics: 17
      Replies: 1869
      Points: 3,631
      Helping Hand
      Rank: Community Hero

      If this is a wrapper for an API and it stops at a specific record count like that, it points to throttle limits on the API, which gets into the paging configuration of the API calls. Say the limit is 800 API calls within 15 minutes and paging returns 50 records per page: 800 × 50 = 40,000 records, exactly where you are hitting the wall. If you want 80,000 records, you would have to page with 100 records per page. If you were running this on another server without issue, that does not make much sense, unless an IP\name was whitelisted on the API side so its calls are exempt from throttling.
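
      The arithmetic above can be sketched as a paging loop (the endpoint URL and the page/pageSize parameter names are hypothetical):

```powershell
# Sketch: paging through an API 100 records at a time to stay under a call throttle.
# The endpoint URL and 'page'/'pageSize' parameter names are hypothetical.
$baseUri  = 'https://example.com/api/parcels'
$pageSize = 100
$page     = 1
$all      = @()
do {
    $batch = Invoke-RestMethod -Uri "$baseUri?page=$page&pageSize=$pageSize"
    $all  += $batch
    $page++
} while ($batch.Count -eq $pageSize)
# 80,000 records at 100 per page = 800 calls, right at the hypothetical throttle limit
```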

    • #276204
      Participant
      Topics: 1
      Replies: 12
      Points: 41
      Rank: Member

      Thank you all for your suggestions and time. At this point both the network team and the DBA have looked and did not find anything unusual. I will probably use some other process to load the data.
      Thanks
