Copy / Move One SAP Client From One System to another System - sap-basis

We have one SAP system in the US (let's call it TKIJVPL1) that has an SAP client 241. We have another SAP system in Germany (let's call it Lockweiler).
We need to move this client 241 from our TKIJVPL1 server to this new server.
Can I simply use transaction SCC8? It says client export, but when I look at the options, it says
Source client : 241 (which is good),
Profile Name SAP_ALL (which is also good as I need all data),
but for Target System, all that comes up is PL1 / QL1.
What is the easiest way to export one client from one SAP system to another?
Or better, I would export it to a hard drive, put the files on a DVD, and mail them to Germany, but I do not see a transaction for exporting to the local disk.

I got it working.
You still need to use transaction SCC8; a full client export can take anywhere from 8 to 20+ hours. While it is exporting, watch the folder /usr/sap/trans/data, where SAP will create files with the system ID as the extension. In my case it created files like RT03292.PL1 and a few others.
These files then need to be taken to the target system for the import...
Here are some good links pertaining to this:
http://basissap.blogspot.com/2008/05/what-is-client-copy.html
http://forums.sdn.sap.com/thread.jspa?threadID=1713405&tstart=0
Make sure you look in /usr/sap/trans/data for the data files (they usually start with R).
Also make sure you look in /usr/sap/trans/cofiles for the control files (they usually start with K).
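To make the import side a bit more concrete, here is a rough sketch (in Python, purely illustrative; the paths, the share name, and the transport number are taken from the example above and will differ on your systems) of what "taking these files" amounts to. A client export usually produces more than one request, so repeat this for each data file / cofile pair; the actual import is then done on the target system by adding the request to the import queue in STMS and finishing with the post-import steps in transaction SCC7:
import shutil
from pathlib import Path

# Assumed locations: the source system's transport directory and a share that
# points at the target system's transport directory (a DVD works just as well).
SOURCE_TRANS = Path("/usr/sap/trans")
TARGET_TRANS = Path("//lockweiler-host/sapmnt/trans")

def stage_client_export(number: str, source_sid: str) -> str:
    """Copy the data file (R*) and cofile (K*) of one export request to the
    target transport directory and return the request name for the import queue."""
    data_file = f"RT{number}.{source_sid}"   # e.g. RT03292.PL1 in trans/data
    cofile = f"KT{number}.{source_sid}"      # e.g. KT03292.PL1 in trans/cofiles
    shutil.copy2(SOURCE_TRANS / "data" / data_file, TARGET_TRANS / "data" / data_file)
    shutil.copy2(SOURCE_TRANS / "cofiles" / cofile, TARGET_TRANS / "cofiles" / cofile)
    # The transport request name is usually the source SID plus the cofile name
    # without its extension, e.g. PL1KT03292.
    return f"{source_sid}KT{number}"

print(stage_client_export("03292", "PL1"))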

Related

After a certain limit it should point to a different VM

I have a file store in which files are saved under a numeric ID (the SQL index), but my VM's disk is full and I can't move the existing files to a different cloud or anywhere else.
My file URLs look like
https://example.com/5374/randomstring.jpg
where 5374 is the file number saved in the SQL DB and the file name is a generated random string.
What I'm planning is to use nginx to route requests: right now this VM holds files up to 56770. New uploads should be saved to a different VM, and if a user requests file 56771 or later, nginx should point to that VM.
You will make your life easier by choosing the cutoff point yourself; it's not essential, but it makes the regular expression a lot more concise.
If you said 56000 and above was on VM2, then for five-digit IDs your regex is as simple as /(5[6-9][0-9]{3}|[6-9][0-9]{4})/ (the first alternative covers 56000-59999, the second 60000-99999); add another alternative once your file numbers reach six digits.
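If you want to sanity-check where that pattern starts and stops matching before you put it into your nginx location block, a quick throwaway script (Python here, purely illustrative) is enough:
import re

# Five-digit IDs of 56000 and above; add another alternative once IDs reach six digits.
cutoff = re.compile(r"/(5[6-9][0-9]{3}|[6-9][0-9]{4})/")

for file_id in (55999, 56000, 59999, 60000, 99999):
    url = f"https://example.com/{file_id}/randomstring.jpg"
    print(file_id, "-> new VM" if cutoff.search(url) else "-> current VM")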

SQLite through Mainframe

We have a batch mainframe JCL job and an SQLite file on a Windows share path.
We need the data on the Windows share to be updated periodically based on a mainframe computation.
So we need to create records in the SQLite database from the mainframe JCL/COBOL program and then manipulate them using SQLite.
Is this feasible? We are not able to find any leads on how to make use of SQLite from a mainframe standpoint. Any information would be very helpful.
Someone's probably going to have to write a CICS routine for you. It might be a better idea to have a program run at your end at the set time(s) and invoke the Mainframe CICS program through yours using web services.
Since the question says that you're dependent on Mainframe calculations, you will have to make sure that you call the CICS program with all the required parameters and values or make sure that it can fetch those natively. Have the CICS program do the computations for you and return the results.
It might also be possible that what you refer to as "Mainframe JCL / COBOL program" (i.e. batch) already has a CICS (online) counterpart and you wouldn't have to write (or make someone write for you) a new routine again. Your Mainframe team should be able to confirm.
You can create an SSH server and serve the data from a file on the HFS side.
You can also FTP to a Wintel stack (with a NETRC DDNAME).
Yes, you can serve web/REST from CICS as well (overkill), or use MQSeries (ditto), or even SMTP. You could also scrape 3270 via EHLLAPI (obscure), or use third-party products like XCOM.
Again, on the USS (OMVS) side you should be able to run sftp in batch mode (-b) with a script. BPXBATCH is your friend...
A great many shops have been doing these things and more for a very long time.
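To make the sftp suggestion concrete, here is one minimal sketch of what the Windows side could look like, assuming the batch job writes a pipe-delimited extract to an HFS file and an SSH server is running on the mainframe. The host name, paths, credentials, and table layout are placeholders, and paramiko stands in for whatever SFTP client you prefer:
import csv
import sqlite3
import paramiko

MAINFRAME_HOST = "mainframe.example.com"            # placeholder host
REMOTE_EXTRACT = "/u/batch/output/extract.txt"      # written by the JCL/COBOL job
LOCAL_EXTRACT = r"C:\temp\extract.txt"
SQLITE_DB = r"\\fileserver\share\data.db"           # the SQLite file on the share

def pull_extract() -> None:
    """Download the extract produced by the mainframe batch job over SFTP."""
    transport = paramiko.Transport((MAINFRAME_HOST, 22))
    transport.connect(username="batchuser", password="secret")   # or key-based auth
    try:
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.get(REMOTE_EXTRACT, LOCAL_EXTRACT)
    finally:
        transport.close()

def load_into_sqlite() -> None:
    """Insert the extracted records into the SQLite database (assumes the extract
    arrives as plain text, i.e. any EBCDIC conversion already happened)."""
    conn = sqlite3.connect(SQLITE_DB)
    with conn, open(LOCAL_EXTRACT, newline="", encoding="utf-8") as fh:
        conn.execute("CREATE TABLE IF NOT EXISTS results (id TEXT, value TEXT)")
        conn.executemany("INSERT INTO results (id, value) VALUES (?, ?)",
                         csv.reader(fh, delimiter="|"))
    conn.close()

if __name__ == "__main__":
    pull_extract()
    load_into_sqlite()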

What would be the best practice for downloading all the files from a directory using SFTP

I would like to implement the following functionality:
downloading all the files from a specified remote directory to a local directory.
after downloading all the files I need a list file which contains all the downloaded files.
(I only want this list file when all the files were downloaded successfully.)
Point 1:
Let's say we have around 10 files in the remote directory.
I can use an int-sftp:inbound-channel-adapter component to download all the files but 10 poll cycles are needed to download all of them since the inbound component is only able to download 1 file per poll request.
Spring Integration creates 10 File messages one by one.
Questions:
How can I identify the last file (message) received from the SFTP server?
I don't want to let users access the list file until all the files from the SFTP server have been successfully received.
How can I achieve this?
I can write the file names into a list file using the int-file:outbound-channel-adapter, but users could read temporary information from that file before the download process is finished.
How can I trigger an event once all the files on the SFTP server have been downloaded?
Thanks for your advice,
Ferenc
First of all, this isn't correct:
the inbound component is only able to download 1 file per poll request
You can configure it to download everything available during a single poll with max-messages-per-poll="-1"; that option is configured on the <poller>.
Anyway, if your use case really is to download one file per poll, you can go ahead with those requirements.
Since any messaging system tries to stay stateless, it is normal that one message doesn't know anything about another, and so they don't impact each other. The asynchronous scenario is the best fit for messaging; with it, the second message may even be processed more quickly than the first.
Your requirement is interesting enough, and I won't call it strange, because any business case has its place.
Since you are going to process several downloaded files as one group, you need some kind of marker on the remote server. It could be a time frame that you extract from the file timestamps, or a marker file stored on the remote server to indicate that a set of files is finished and that you can process their local versions from your application. It would be great if that marker file contained the list of file names in the group.
Otherwise we don't have any hook to group the messages for those files.
On the other hand, you can consider using <int-sftp:outbound-gateway> with the MGET command: http://docs.spring.io/spring-integration/docs/latest-ga/reference/html/sftp.html#sftp-outbound-gateway
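Not Spring Integration config, but for illustration, the gist of that marker-file approach looks like this (plain Python sketch; the directory names, the BATCH.READY marker, and paramiko are all assumptions), and the same flow maps onto the adapters and gateways mentioned above:
import os
import paramiko

REMOTE_DIR = "/outbound"
LOCAL_DIR = "/var/data/incoming"
MARKER = "BATCH.READY"   # written last by the sender; lists the group's file names

def fetch_group(sftp: paramiko.SFTPClient) -> None:
    """Download a complete group of files, then publish the local list file."""
    if MARKER not in sftp.listdir(REMOTE_DIR):
        return   # the group is not complete yet; try again on the next poll

    data = sftp.open(f"{REMOTE_DIR}/{MARKER}").read().decode()
    group = [name.strip() for name in data.splitlines() if name.strip()]

    for name in group:   # download every file the marker promises
        sftp.get(f"{REMOTE_DIR}/{name}", os.path.join(LOCAL_DIR, name))

    # Only now write the list file that users are allowed to read.
    with open(os.path.join(LOCAL_DIR, "downloaded.lst"), "w") as out:
        out.write("\n".join(group) + "\n")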

Using an encrypted file securely

I'm writing an application with a dBASE database file in Borland Delphi 7.
Note: I think this question is file-security related and you can forget the dBASE thing (consider it as a TXT file) in this question.
The database must be accessible only by the application, so it must be encrypted. Unfortunately dBASE doesn't support any password mechanism, so I had to encrypt the file myself (and I also HAVE to use dBASE).
What approach do you suggest to secure the database file?
The simple one is:
Encrypt the database file and place it next to the application EXE file.
When the application runs, it decrypts the file (with a hard-coded password) and copies the result to a temporary file that has the DeleteOnClose and NoSharingPermission flags.
When closing, the application encrypts the temp dBASE file and replaces the old encrypted file with the new one.
I think this is a fairly secure approach, but it has two big problems:
With an undelete tool the user can restore and access the deleted temp file.
Worse: if the system reboots suddenly while the application is running, the DeleteOnClose flag never takes effect, so the temp file remains on the hard disk and the user can access it.
Is there any solution for, at least, the second part?
Is there any other solution?
You could also try to create a TrueCrypt file-based container, mount it, and then put the dBASE file inside the mounted encrypted volume. TrueCrypt is free (in both senses) and it's accessible via command-line parameters from your application (mount before start, unmount before quit).
Depending on what you're doing with the database, you may be able to get away with just decrypting the records you actually need. For example, you could build indexes based on hash codes (rather than real data); this would reduce seeks into the database to a smaller set of data. Each record in the subset would have to be decrypted, but this could be a lot better than decrypting the entire database.
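To illustrate the hashed-index idea with a sketch (not tied to dBASE or to any particular cipher; decrypt_record is a placeholder for whatever decryption the application really uses): the index stores only hashes of the lookup key, so a query decrypts just the records whose key hash matches:
import hashlib
import json

def key_hash(value: str) -> str:
    """Hash of the plaintext lookup key; only this value goes into the index."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

def decrypt_record(blob: bytes) -> dict:
    """Placeholder for the application's real decryption; the 'encrypted' records
    below are just JSON so the sketch stays runnable."""
    return json.loads(blob)

# Position in the database file -> encrypted record blob.
encrypted_records = {
    0: b'{"customer": "ACME", "balance": 120}',
    1: b'{"customer": "Globex", "balance": 75}',
    2: b'{"customer": "ACME", "balance": 300}',
}

# Built once, at write time: hash of the key field -> record positions.
index = {}
for pos, blob in encrypted_records.items():
    record = decrypt_record(blob)
    index.setdefault(key_hash(record["customer"]), []).append(pos)

def find_by_customer(name: str) -> list:
    """Hash the query and decrypt only the matching records."""
    return [decrypt_record(encrypted_records[pos]) for pos in index.get(key_hash(name), [])]

print(find_by_customer("ACME"))   # decrypts 2 of the 3 records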

Programmatically open an email from a POP3 and extract an attachment

We have a vendor that sends CSV files as email attachments. These CSV files contain statuses that are imported into our application. I'm trying to automate the process end-to-end, but it currently depends on someone opening an email and saving the attachment to a server share so the application can use the file.
Since I cannot convince the vendor to change their process, such as offering an FTP location or a web service, I'm stuck with trying to automate the existing process.
Does anyone know of a way to programmatically open an email from a POP3 account and extract an attachment? The preferred solution would reside on a Windows 2003 server, be written in VB.NET, and be secure. The application can reside on the same server as the POP3 server; for example, we could set up the free POP3 server that comes with Windows Server and pull against the mail file stored on the file system.
BTW, we are willing to pay for an off-the-shelf solution, if one exists.
Note: I did look at this question but the answer points to a CodeProject solution that doesn't deal with attachments.
Try the Mail.dll email component; it's very affordable, supports attachments and national characters, is easy to use, and also supports SSL:
Using pop3 As New Pop3()
    pop3.Connect("mail.server.com")
    pop3.Login("user", "password")

    Dim builder As New MailBuilder()
    For Each uid As String In pop3.GetAll()
        ' Receive the raw email message and parse it
        Dim mail As IMail = builder.CreateFromEml(pop3.GetMessageByUID(uid))

        ' Write out the received message's subject
        Console.WriteLine(mail.Subject)

        ' Save each attachment from the mail.Attachments collection
        For Each attachment As MimeData In mail.Attachments
            Console.WriteLine(attachment.FileName)
            attachment.Save("c:\" + attachment.FileName)
            ' attachment.Data is also available here
        Next attachment
    Next
    pop3.Close(True)
End Using
You can download it here: http://www.lesnikowski.com/mail.
Possible duplicate of Reading Email using Pop3 in C#.
At least there's a shedload of suggestions there that you may find useful.
I'll throw in a late suggestion for a more generalized "download POP3 messages and extract attachments" solution using existing software and minimal programming. I needed to do this for a client who switched to receiving faxes via email and was not pleased with manually saving the attachments to a location where they could be imported into an application.
For downloading messages on *nix systems fetchmail seems to be the standard and is very capable, but I chose mpop for both simplicity and Windows compatibility (but it is cross-platform). If mpop hadn't done the trick for me, I probably would have ended up doing something with the Python-based getmail, which was created when fetchmail's development stalled for a time (it's since resumed).
Mpop is controlled either via command line or configuration file, so I simply created multiple configuration files and specify via command line which file to load. I'm using it in "Exchange pickup directory" mode, which means it simply downloads the messages and drops them as text (.eml) files in a specified directory.
For extraction of the message attachments, UUDeview appears to be the standard (I'm using the Windows port of UUDeview) across just about any system you could want with just about any features you could want. My main alternative to this was a much-less-capable Python script that I'd developed for a different client back in 2007, but I'm happy to go with a precompiled executable over either installing Python or packaging with any of the Python-to-exe options.
Finally there's the configuration - along with the two mpop configuration files mentioned above (which I could do away with by using command-line options), I also have two 2-line .cmd files launched every 10 minutes by scheduled task - the first line to launch mpop to download into a working directory and the second line to launch UUDeview and extract attachments of specified types (.pdf or .tif) then delete each file from which it extracted attachments. Output is sent to another directory from which staff can directly attach files as needed.
This is overall not the most elegant way to reach these ends, but it was quick, simple, functional and reasonably robust - at each stage if something goes wrong it fails such that no data is lost. The only places where data could be lost are any non-attachment messages being sent to the dedicated fax email addresses, and even those will sit in the processing directory and be caught eventually.
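For what it's worth, if you would rather skip mpop and UUDeview entirely, the same download-and-extract loop also fits in a short script using only Python's standard library (the server name, credentials, and output folder below are placeholders):
import os
import poplib
import email
from email import policy

HOST, USER, PASSWORD = "pop3.example.com", "faxes", "secret"   # placeholders
OUT_DIR = r"C:\faxes\incoming"
WANTED = (".pdf", ".tif")

box = poplib.POP3_SSL(HOST)
box.user(USER)
box.pass_(PASSWORD)

for i in range(1, len(box.list()[1]) + 1):
    raw = b"\r\n".join(box.retr(i)[1])
    msg = email.message_from_bytes(raw, policy=policy.default)
    saved_any = False
    for part in msg.iter_attachments():
        name = part.get_filename()
        if name and name.lower().endswith(WANTED):
            with open(os.path.join(OUT_DIR, name), "wb") as fh:
                fh.write(part.get_payload(decode=True))
            saved_any = True
    if saved_any:
        box.dele(i)   # only delete messages we actually extracted something from

box.quit()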
