I have a process application which I exported from one environment and want to import into another. The higher environment already has an older version of the process application running. I could not find any option in Process Center to update the process application. Whenever I try to import it, it says that the application already exists. Could anyone let me know how to do this?
On the IBM BPM platform, each process application is identified by its prefix (acronym).
Try first editing the prefix of the one you already have, then import the new one and it will work.
Change the snapshot name; the server you want to import to may already have a snapshot with the same name.
I am new to Deno and was looking into how it differs from Node.js. I found that Deno always fetches modules online at run time from https://deno.land/.
But Node only uses the internet during the installation of modules.
So if the internet is unavailable, or only a slow connection is available, how can we overcome this issue in Deno?
I found that Deno doesn't need an internet connection once the modules are loaded.
They are cached locally (in Deno's cache directory), and it's the same cached module you'll keep using until you run with the --reload flag.
So it's practically the same as Node and how package.json files work.
I think Deno is here to replace Node.js, with its security features being its greatest asset, and that's going to be invaluable given the security threats we constantly face.
The best way, IMO, is to create a deps.ts file as part of your project and declare all your imports in it. You can import from there into other files, so you have a single point at which to maintain them AND more control over whether internet access will be needed.
The first time an import is needed Deno will fetch it and then keep it local.
If there is no internet the local copy will be used.
If you want to avoid internet do NOT use imports without a version because Deno will try to get the latest version each time. It cannot know if the version changed since last run, and hence will check.
import {isWindows} from "https://deno.land/std/_util/os.ts";
export {isWindows};
Instead, use a version. If Deno sees that that version is present locally, it will use it. There is no ambiguity here: that version is that version.
import {isWindows} from "https://deno.land/std#0.88.0/_util/os.ts";
export {isWindows};
You can monitor the process in your console window.
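To illustrate, here is a minimal sketch of consuming deps.ts from another module; the file name main.ts is just an example, not from the original answer:

// main.ts - every external dependency comes in through deps.ts
import { isWindows } from "./deps.ts";

console.log(`Running on Windows: ${isWindows}`);

You can also pre-fetch everything once, while you still have connectivity, with deno cache deps.ts; after that, runs are served from the local cache until you pass --reload.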
Furthermore, if by internet you really mean the public internet as opposed to just the network (e.g. your LAN), then you can set up a local resource that the imports can be fetched from, which makes you independent of the internet and only dependent on the availability of that local LAN resource. I figure that if that resource goes down, your Deno app, if it is for instance a server, cannot be reached anyway.
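If you do set up such a local mirror, an import map is one way to point the existing import URLs at it without editing every file. A rough sketch, assuming import-map support in your Deno version (older releases needed --unstable); the LAN host name below is a placeholder:

import_map.json:
{
  "imports": {
    "https://deno.land/std@0.88.0/": "http://deno-mirror.example.lan/std@0.88.0/"
  }
}

deno run --import-map=import_map.json main.ts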
I would appreciate it if someone could shed some light on the best way to back up and restore Artifactory. The primary concern is that the repositories plus users and permissions are available on restore.
I am trying the System Export and Import feature as well as incremental backup plus import, but I am having no success with users/permissions when the restore happens. I expect to see ALL users from the source after the import, but the net result is the opposite: after the restore completes, the only user I see is access-admin.
I even lose the admin and anonymous users on the destination instance.
As per the Artifactory documentation (https://www.jfrog.com/confluence/display/RTF/Importing+and+Exporting), system export and import should take care of everything including security, BUT my tests do not seem to bring the user info across on import.
From the doc: at system level, Artifactory can export and import the whole Artifactory server: configuration, security information, stored data and metadata. This is useful when manually running backups and for migrating and restoring a complete Artifactory instance (as an alternative to using database-level backup and restore).
artifactory.version=5.9.3
artifactory.timestamp=1521564024289
artifactory.revision=50903900
artifactory.buildNumber=820
I am using the default embedded Derby database mode. On the new instance I am not restoring the database separately, since as per my understanding the documentation says a full system backup and restore takes care of all configuration.
But I am new to Artifactory, so please correct me if my understanding is not correct.
I'd like to point out that the restore is tried on a separately spun-up new instance created from the JFrog AMI, not on the instance from which the data was backed up.
This is to test whether, if we lose our instance completely, we can spin up a new AWS instance and quickly restore the Artifactory environment.
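For reference, this is roughly how a full system export and import can be triggered over the REST API; the endpoints are the documented Export/Import System calls, but the host names, credentials and paths below are placeholders, so treat this as a sketch rather than my exact commands:

curl -u admin:<password> -X POST -H "Content-Type: application/json" -d '{"exportPath":"/var/backup/full-export","includeMetadata":true,"createArchive":false}' http://source-host:8081/artifactory/api/export/system

curl -u admin:<password> -X POST -H "Content-Type: application/json" -d '{"importPath":"/var/backup/full-export","includeMetadata":true,"failOnError":true}' http://target-host:8081/artifactory/api/import/system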
Thanks in advance for help.
I am trying to import an MSI, without bindings, into BizTalk Server 2013 in production, and I get the following error:
Error in Importing Application
Import Wizard[21.1.2016 14:10:10]: Change requests failed for some resources.
BizTalkAssemblyResourceManager failed to complete end type change request.
Cannot access a disposed object.
Object name: 'ServicedComponent'.
Side note: although it's not advised, I also tried to delete the application and do a fresh import, but I couldn't delete it either.
Any pointers are appreciated. thx
Update: after Googling I found this link: https://social.msdn.microsoft.com/Forums/en-US/17eed40a-0175-407a-b450-53f3be6e087e/failed-to-add-resources-to-application-mscorlib?forum=biztalkgeneral
The problem in that thread was solved with 'power admin rights'. I have Global Admin rights, and I can create ports and such.
Solution: there were some admin rights missing; I added all of them (make sure the account has all the required admin rights). Also, make sure you are importing the MSI into BizTalk as administrator (run as administrator).
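If the Administration Console import keeps failing even with the right permissions, importing from an elevated command prompt is another way to test it; a minimal sketch using BTSTask, where the application name and MSI path are placeholders:

BTSTask ImportApp /Package:"C:\Deploy\MyApp.msi" /ApplicationName:"MyApp" /Overwrite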
I have a few problems with Meteor. Is there any need to create the database programmatically (e.g. with "use mydb")? I haven't used that so far; I'm directly creating collections and applying CRUD operations on them. But I have seen things like db.collection.find() a few times, and when I try that on my collection it shows an error that db is not initialized. How do I initialize it? My main problem is that I tried to import some content from a .json file into my collection, which I thought is only possible by going through the database directly. I can import it from the shell like this:
mongoimport --db test --collection mobiles <products.json --jsonArray
So how can I import it without creating a db?
You would have to show some code to see what exactly the issue is.
Meteor uses MongoDB, so a schema doesn't need to be strictly created for things to work, as it would be in MySQL or a traditional SQL database. You can just insert documents, and if the collection or the database doesn't exist, it is created on the fly without being created separately beforehand.
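For example, a minimal sketch of that implicit creation (the collection name mobiles matches your mongoimport command; the document fields are made up):

// collections.js - shared by client and server
Mobiles = new Mongo.Collection('mobiles');

// server only: the collection and database are created lazily on first insert
Meteor.startup(function () {
  if (Mobiles.find().count() === 0) {
    Mobiles.insert({ brand: 'ExampleBrand', model: 'X1' });
  }
});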
To import your file you need to import into your Meteor database, which runs at port 3002 (if your Meteor app is running on port 3000, i.e. the Meteor app port + 2). Something like this should work; the database is meteor:
mongoimport --db meteor --host localhost:3002 --collection mobiles --jsonArray --file production.json
(I'm not sure about your file structure, so I'm assuming it's --jsonArray --file production.json.) You can check the docs at http://docs.mongodb.org/v2.4/reference/program/mongoimport/ for more details.
So again, you don't need to create a database when you do this; the --db argument loads things into the meteor database, and if it doesn't exist it is created automatically as you use it.
I am testing moving all my development to the Nitrous.io IDE, but with limited space in my Nitrous box I want to permanently host my Mongo databases at MongoHQ.com. Currently, each day I need to set my MONGO_URL by running:
export MONGO_URL='mongodb://<user>:<pass>@paulo.mongohq.com:12345/<db>'
If I fire up another console or log out of Nitrous, my MONGO_URL needs to be set again.
How can I set the development MONGO_URL for good, per Meteor app? I cannot find a config file anywhere.
Nitrous support helped me find a quick solution. Just wanted to answer it here for others with the same issue.
Open ~/.bash_profile and enter your DB information.
example:
export MONGO_URL='mongodb://jimmy:criket@paulo.mongohq.com:12345/mynitrobox'
Next in the console run source ~/.bash_profile to load the settings.
This sets the DB for your entire node.js box, not individual meteor apps, so you may want to structure your Mongo collections accordingly with subcollections.
you can do this in one line like so:
MONGO_URL='mongodb://<user>:<pass>@paulo.mongohq.com:12345/<db>' meteor
I don't know much about Nitrous.io but in AWS EC2 I have an upstart job that runs this for me when the server starts.
I gisted my approach a while back; I've since changed it a bit, but this still works:
https://gist.github.com/davidworkman9/6466734
I don't know that this will help you in Nitrous.io though, good luck!
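For what it's worth, a minimal sketch of what such an upstart job can look like; the file name, paths and connection string are placeholders and not taken from the gist:

# /etc/init/myapp.conf
description "Meteor bundle with an external MongoDB"
start on runlevel [2345]
stop on runlevel [016]

env MONGO_URL='mongodb://<user>:<pass>@paulo.mongohq.com:12345/<db>'
env ROOT_URL='http://example.com'
env PORT=3000

exec node /home/appuser/bundle/main.js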