Azure Data Lake U-SQL parallel jobs failed

We are running a U-SQL job through ADF that reads a folder containing an Avro file and converts the data to CSV. Running simultaneous jobs, reading from different folders and outputting to different folders, caused the following error:
FinalMetadataOperationUserError: Failed to write job meta-data due to user error
Component: JobManager_User
Message: Failed to write job meta-data due to user error
Description: Version of object ddc1c212-b227-4b0b-8199-58bde69ce2e1.master.Avro doesn't exist
at Scope.MetadataService.Client.Usql.WcfBasedMetadataClientForUsql.ExecuteMetadataJob(MetadataJob mdJob)
at MetaDataExecutor.RealMetaDataExecutor.Execute(MetadataJob job)
at MetaDataExecutor.Committer.Commit(MetadataJob job)
Job URL: https://***.azuredatalakeanalytics.net/jobLink/8b604ae4-179d-4375-a68a-c1b6771473ca
Upon checking the job resources, the 8b604ae4... folder indeed has the Avro DLL. Does anyone understand this error message, or how can we set up ADF to run parallel U-SQL jobs? Our pipeline is pretty straightforward, as seen in the image below.
Thanks!

You need to register the DLL in the database against which your ADF job runs.
So upload the DLL to your Data Lake Store (e.g. into an "Assemblies" folder) and create a script to register it:
USE DATABASE master; // the error above mentions master.Avro, i.e. the master database
DROP ASSEMBLY IF EXISTS [Avro];
CREATE ASSEMBLY [Avro] FROM @"Assemblies/Avro.dll";
Run the script once and the assembly will be registered in the database; your job scripts can then pick it up with REFERENCE ASSEMBLY [Avro];.
Edit: if you want to output to different folders, make sure you are using SET @@FeaturePreviews = "DataPartitionedOutput:on"; in your U-SQL code.

Related

Error in generating UnitTestsJUnitReport.xml in JUnit -> XML document structures must start and end within the same entity

I have been facing this issue for some days. I have a Jenkins server configured to use MSTest.exe for my unit test execution in an ASP.NET application. This is weird, as it only happens for one particular SVN link; every other SVN source executes fine.
Below is the console log:
TestResults\UnitTestReport.trx D:\Sonar\Tools\mstest-to-junit.xsl -o TestResults\UnitTestsJUnitReport.xml
Error occurred while executing stylesheet 'D:\Sonar\Tools\mstest-to-junit.xsl'.
Code: 0x8007000e
Not enough storage is available to complete this operation.

TFS: The process cannot access the file because it is being used by another process

I have created an automated build process using TFS which builds a web application.
As part of this process, a batch file calls aspnet_merge to merge my web pages into one DLL. I'm using the TFS Invoke Process activity to do this.
The following is a screenshot of what is output in the TFS build window:
[Screenshot: TFS Build Output]
Does anyone have any idea how to troubleshoot this issue?
I solved this issue by removing the "Start /high /wait" command that I had in place to start the aspnet_merge tool in a separate window. It was there because, in a local script, we compiled the code files with aspnet_compiler before running aspnet_merge.
I also had to split the rest of the file out into a separate command file, as it was deleting config files I needed.

Publishing failed

We are using Tridion 2011 SP1. Some of the pages/components fail while publishing, with the error below.
Phase: Deployer Prepare Phase failed, Unable to unzip,
D:\Inetpub\TridionPublisherFS4SP\incoming\tcm_0-286137-66560.Content.zip (The process
cannot access the file because it is being used by another process),
D:\Inetpub\TridionPublisherFS4SP\incoming\tcm_0-286137-66560.Content.zip (The process
cannot access the file because it is being used by another process), Unable to unzip,
D:\Inetpub\TridionPublisherFS4SP\incoming\tcm_0-286137-66560.Content.zip (The process
cannot access the file because it is being used by another process),
D:\Inetpub\TridionPublisherFS4SP\incoming\tcm_0-286137-66560.Content.zip (The process
cannot access the file because it is being used by another process)
Components/Pages are failing at the Preparing Deployment stage. How should we fix it?
Do you have multiple Deployers using the same incoming location?
It looks like you’re running the Deployer as a WebApp – is the Deployer service also running on the system?
If you search for all files named “cd_deployer_conf.xml”, do they have the same incoming folder (D:\Inetpub\TridionPublisherFS4SP\incoming) defined?
Otherwise, you might use ProcMon to watch the folder and see what else is accessing the file.
If you still have this issue, you may try
1. deleting all files under incoming,
2. making sure there is no encryption enabled for the incoming folder (some companies run an encryption script on files as soon as they are added to the drive), or
3. making sure your antivirus is not screening that folder (As Nuno mentioned).
Also try restarting the Deployer app and verifying in the logs.

Error 0x800401F3: one process works, one doesn't, same website

I'm stuck on this one. I hope someone here has some experience with this. Here is the situation: I have set up a web page that allows users to upload flat files to be loaded into SQL Server 2005 using SSIS. There are two different SSIS processes, depending on the file type. The decision of which SSIS process to use is made by the user on the website.
Once the file is uploaded by the user, the process is started by a .NET Process object. The command line is what you'd expect for starting dtexec with a specific SSIS package file, setting a couple of variables. For example:
dtexec /f /De /set value
The ASP.NET Anonymous User is running as a domain user account. All SSIS package files for both SSIS processes are in the same directory, and the domain user account has full privileges on that directory. The same method in ASP.NET starts either of the processes; the only difference is the WebMethod called by the website, one for each type. It is in these WebMethods that the unique arguments are assigned to the command line text for SSIS.
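For context, this is roughly how such a launch is wired up; a minimal sketch, with a hypothetical package path and variable name (not taken from the question):

using System.Diagnostics;

// Build the dtexec call; the package path and variable name are placeholders.
var psi = new ProcessStartInfo
{
    FileName = "dtexec",
    Arguments = "/f \"D:\\SSIS\\LoadFlatFile.dtsx\" /set \\Package.Variables[User::SourceFile].Value;\"D:\\Uploads\\input.txt\"",
    UseShellExecute = false,        // must be false to redirect output
    RedirectStandardOutput = true
};

using (var process = Process.Start(psi))
{
    // dtexec reports progress and errors on standard output.
    string output = process.StandardOutput.ReadToEnd();
    process.WaitForExit();
    // A non-zero exit code means the package failed; log the output to see why.
}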
Here is where I have run into the problem. When running the website process "1", it runs fine, but process "2" fails with the error mentioned above. When I capture the Standard Output I receive this:
Microsoft (R) SQL Server Execute Package Utility Version 9.00.4035.00 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 10:34:14 AM
Could not create DTS.Application because of error 0x800401F3
Started: 10:34:14 AM
Finished: 10:34:14 AM
Elapsed: 0.016 seconds
I don't understand how everything can be nearly identical yet only one will run. One final thing: both methods work fine when I test directly from Visual Studio. I figure it must be something to do with the Anonymous User account being used, but I can't figure out why one process would work and the other wouldn't when they are so similar.
Any help will be greatly appreciated.
Rob
Found the problem. The error code was a phantom. What happened was that a connection component was being fed by a variable holding a path to a folder the new account could not access. Even though at run time it would have been replaced with a good target, it was failing during validation. That is why there were no logs: I didn't have the logging level high enough to see it, and it acted like a security issue, which in a way it was.
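For anyone who hits the same thing: one workaround is to feed the connection's variable a path the service account can actually reach directly on the dtexec command line, so validation passes; the package name and variable below are hypothetical. Alternatively, set DelayValidation to True on the connection so validation waits until the variable holds its runtime value.

dtexec /f "LoadFlatFile.dtsx" /set \Package.Variables[User::TargetFolder].Value;"D:\PathTheServiceAccountCanReach"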

Run web app code from command line?

I have an ASP.NET web application that includes code for enforcing its own database schema; this code runs on application start.
I've recently started using LINQ to SQL, and I've added a pre-build event to run SqlMetal on my database so that I get objects representing my db tables.
What would be really cool is if I could enforce the database schema in the pre-build event, and then run SqlMetal. As it is, if the schema changes (e.g. I add a field to a table), I have to (a) build and run the website once so that application start fires and the schema is enforced, and then (b) build it again so that SqlMetal runs.
So: What are my options for running code that lives in my web application, from the command line?
Here's what we do.
We have a local one-click build that is required to be run before check-in (an integration build also runs in a separate environment on every check-in...).
The NAnt script will:
Rebuild the database from scratch using Tarantino (Database change management)
Clean & Compile
Copy DLLs to a separate directory
Run Unit Tests against the DLLs
We have a separate script for SqlMetal, but your question is going to have me look at inserting the call between steps 1 and 2. That way your database changes and LINQ-generated files are always in sync.
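For what it's worth, the SqlMetal step is just another command-line call that the build script can run right after the database step; a sketch, with placeholder server, database, and file names:

sqlmetal /server:localhost /database:MyAppDb /code:DataClasses.designer.cs /namespace:MyApp.Data /pluralize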
You could either write a small program that uses CodeDOM to compile and run a file in your repository, or directly call the compiler executable inside your pre-build event.
Using CodeDOM avoids any problems with having to know where the compiler executable is, but if your code can't be contained in one file without any dependencies it's unusable; in that case calling the compiler and then executing the result is the better option.
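A minimal sketch of the CodeDOM route, assuming the schema code fits in a single file; the file, type, and method names below are hypothetical:

using System;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

class SchemaRunner
{
    static void Main()
    {
        var provider = new CSharpCodeProvider();
        var options = new CompilerParameters { GenerateInMemory = true };
        options.ReferencedAssemblies.Add("System.dll");

        // "SchemaEnforcer.cs" stands in for the file checked into your repository.
        CompilerResults results = provider.CompileAssemblyFromFile(options, "SchemaEnforcer.cs");
        if (results.Errors.HasErrors)
            throw new InvalidOperationException("Schema code failed to compile.");

        // Invoke a static entry point via reflection (hypothetical type/method names).
        Type type = results.CompiledAssembly.GetType("SchemaEnforcer");
        type.GetMethod("EnforceSchema").Invoke(null, null);
    }
}

Compile this once as a small console app and call it from the pre-build event, before the SqlMetal step.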
