For many organizations and people that produce data for a variety of reasons, there is often a decision to be made regarding which file format to put it in. While some of those decisions are made for you by the receiving end's requirements, there are times when you need to make the call on your own.

TL;DR

The Excel file format yields the smaller file size. We'll provide more details, but for now, here is a comparison of 50,000 rows of information where each row has the exact same data. As you can see here, Excel is far and away the winner.

This all started as we were creating a PowerShell script that extracted data from a local Active Directory. In order to make the script more portable and allow it to run on a Windows Server 2008 R2 Domain Controller, we needed to run in a PowerShell version that we knew would be there, would work, and would remove the need to install additional software. Thus, PowerShell version 2.0 was the target (gross, I know). In doing so, we needed to leverage a file format that would also give us some depth to handle arrays without having to deal with those separately. Using the Export-Clixml command, we could pass it a collection of objects (e.g. users) and it would write everything out for us.

We have another utility where we push this data into a Cosmos DB so we can better interrogate and export data using SQL instead of a pile of different scripts. That said, we needed to convert the XML to JSON in this particular example; both steps are sketched below. Using PowerShell 2.0 running on the DC and writing directly into an XML format is all non-wonderful and not the ideal final direction, but this is what presented the problem with file format. Where the issue really presented itself is when we encountered a local Active Directory that produced extremely large file exports.
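For reference, here is a minimal sketch of the export step described above. The query, the property list, and the output path are illustrative placeholders, not the original script's values.

```powershell
# Runs on PowerShell 2.0, which Windows Server 2008 R2 ships with.
# Export-Clixml serializes the whole object graph, nested arrays included,
# so multi-valued attributes like MemberOf come along for free.
Import-Module ActiveDirectory

# Hypothetical query; the post does not show the real filter or properties.
$users = Get-ADUser -Filter * -Properties DisplayName, MemberOf
$users | Export-Clixml -Path 'C:\Exports\users.xml'
```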
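Since the XML eventually needs to become JSON before it lands in Cosmos DB, here is one way that conversion can look, assuming the file is processed on a machine with PowerShell 3.0 or later (ConvertTo-Json does not exist in 2.0). Paths are placeholders again.

```powershell
# Run on a host with PowerShell 3.0+, where ConvertTo-Json is available.
# Import-Clixml rehydrates the objects that Export-Clixml wrote on the DC.
$users = Import-Clixml -Path 'C:\Exports\users.xml'

# -Depth controls how many levels of nested properties get serialized;
# the default of 2 flattens anything deeper into strings.
$users | ConvertTo-Json -Depth 5 | Set-Content -Path 'C:\Exports\users.json'
```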
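And if you want to reproduce the 50,000-row size comparison from the TL;DR, a rough harness like the one below works. The Excel export leans on the community ImportExcel module, which is my assumption here; the post does not say how its Excel file was produced.

```powershell
# Build 50,000 identical rows, write each format, then compare file sizes.
# Export-Excel comes from the ImportExcel module: Install-Module ImportExcel
$rows = 1..50000 | ForEach-Object {
    New-Object PSObject -Property @{
        Name       = 'Test User'
        Department = 'IT'
        Enabled    = $true
    }
}

$rows | Export-Clixml -Path .\sample.xml
$rows | ConvertTo-Json | Set-Content -Path .\sample.json
$rows | Export-Csv -Path .\sample.csv -NoTypeInformation
$rows | Export-Excel -Path .\sample.xlsx

# XLSX is a zipped container, which is largely why it comes out smallest.
Get-ChildItem .\sample.* | Select-Object Name, Length
```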