Python won't download the entire file - python-3.4

import urllib.request
import urllib.error

# Fileupdate and Filecount are set earlier in the script
while Fileupdate <= Filecount:
    print(Fileupdate)
    url = 'http://www.whatever.com/photo/' + str(Fileupdate) + '.jpg'
    try:
        a = urllib.request.urlopen(url)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            Fileupdate = Fileupdate + 1
            continue
    urllib.request.urlretrieve(url, str(Fileupdate) + '.jpg')
    Fileupdate = Fileupdate + 1
    continue
It searches through the website and identifies the web address to download the file from, but it starts the download and then freezes every time after downloading 262.1 KB of the file. It won't download the rest of the file and it won't continue searching through the rest of the series. I wish I hadn't lost my old code; like a dummy, I just started saving over top of it. At least that version worked, with a few flaws I could probably now correct. This one isn't working at all.
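As a side note, here is a minimal sketch of one way to stop a stalled transfer from hanging forever: open the URL with an explicit timeout and stream it to disk yourself instead of relying on urlretrieve. The host name and the range bounds below are placeholders, not the real site:
import shutil
import urllib.error
import urllib.request

# Placeholder range; the real bounds come from Fileupdate/Filecount above.
for n in range(1, 100):
    url = 'http://www.whatever.com/photo/' + str(n) + '.jpg'
    try:
        # The timeout makes a stalled socket raise an error instead of blocking forever.
        with urllib.request.urlopen(url, timeout=30) as response, \
                open(str(n) + '.jpg', 'wb') as out_file:
            shutil.copyfileobj(response, out_file)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            continue  # no photo with this number; move on to the next one
        raise
    except OSError:
        print('Transfer of ' + url + ' stalled or failed; skipping.')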

Related

requests.get(url) is hanging after 5 iterations

I am attempting to run a web scraping algorithm on Indeed using BeautifulSoup and loop through the different pages. However, after 2-6 iterations, the requests.get(url) call hangs and stops finding the next page. I have read that it might have something to do with the server blocking me, but that would have blocked the original requests too, and it also says online that Indeed allows web scraping. I have also heard that I should set a header, but I am unsure how to do that. I am running the latest version of Safari and macOS 12.4.
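For reference, setting a header just means passing a dictionary to requests.get. A minimal sketch, where the User-Agent string and the query URL are made-up placeholders (whether this alone stops the hanging is a separate question):
import requests

# Browser-like User-Agent header sent along with the request.
headers = {"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 12_4)"}
response = requests.get("https://www.indeed.com/jobs?q=software+intern",
                        headers=headers, timeout=10)
print(response.status_code)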
A solution I came up with, though it does not answer the question specifically, is to use a try/except statement and set a timeout value on the request. Once the timeout value is reached, execution enters the except block, sets a boolean flag, and then continues the loop to try again. The code is below.
import requests

i = 0
while i < 10:
    url = get_url('software intern', '', i)  # get_url is defined elsewhere in the scraper
    print("Parsing Page Number:" + str(i + 1))
    error = False
    try:
        response = requests.get(url, timeout=10)
    except requests.exceptions.Timeout as err:
        error = True
    if error:
        print("Trying to connect to webpage again")
        continue
    i += 1
I am leaving the question unanswered for now, however, as I still don't know the root cause of this issue and this solution is just a workaround.

Lua - Handle a 301 Moved Permanently error and then save generated image from resulting URL

I'm trying to make an http.request to have a graph created, and then save the resulting .png graph image. The problem is that I want to do this with Lua, and I'm struggling with two parts. (If you open the URL below in a standard browser, you'll see that it works fine.)
Handling a 301 error: I have looked through SO and found a few references to this and to the need to use luasec, which I believe I have installed.
301 moved permanently with socket.http
Here is the script, with the URL I'm trying to call via HTTP; I then (eventually) want to save the resulting graph image (.png file) that's created:
local http = require "socket.http"
--local https = require("ssl.https")
local ltn12 = require "ltn12"

r = {} -- init empty table
local result, code, headers, status = http.request{
    url = "http://www.chartgo.com/create.do?charttype=line&width=650&height=650&chrtbkgndcolor=white&gridlines=1&labelorientation=horizontal&title=Fdsfsdfdsfsdfsdfsdf&subtitle=Qrqwrwqrqwrqwr&xtitle=Cbnmcbnm&ytitle=Ghjghj&source=Hgjghj&fonttypetitle=bold&fonttypelabel=normal&gradient=1&max_yaxis=&min_yaxis=&threshold=&labels=1&xaxis1=Jan%0D%0AFeb%0D%0AMar%0D%0AApr%0D%0AMay%0D%0AJun%0D%0AJul%0D%0AAug%0D%0ASep%0D%0AOct%0D%0ANov%0D%0ADec&yaxis1=20%0D%0A30%0D%0A80%0D%0A90%0D%0A50%0D%0A30%0D%0A60%0D%0A50%0D%0A40%0D%0A50%0D%0A10%0D%0A20&group1=Group+1&viewsource=mainView&language=en&sectionSetting=&sectionSpecific=&sectionData=",
    sink = ltn12.sink.table(r)
}
print("code=" .. tostring(code))
print("status=" .. tostring(status))
print("headers=" .. tostring(headers))
print("result=" .. tostring(result))
print("sink= " .. table.concat(r, ""))
print(result, code, headers, status)
for i, v in pairs(headers) do
    print("\t", i, v)
end
This returns the 301 Moved Permanently error, and via a viewer it also provides me with a link to another URL (this time an https one).
So, to get to the https site, I first tried switching to the ssl.https module with the following, but that does not return anything at all: everything is nil.
local https = require("ssl.https")
local ltn12 = require "ltn12"

r = {} -- init empty table
local result, code, headers, status = https.request{
    url = "https://www.chartgo.com/create.do?charttype=line&width=650&height=650&chrtbkgndcolor=white&gridlines=1&labelorientation=horizontal&title=Fdsfsdfdsfsdfsdfsdf&subtitle=Qrqwrwqrqwrqwr&xtitle=Cbnmcbnm&ytitle=Ghjghj&source=Hgjghj&fonttypetitle=bold&fonttypelabel=normal&gradient=1&max_yaxis=&min_yaxis=&threshold=&labels=1&xaxis1=Jan%0D%0AFeb%0D%0AMar%0D%0AApr%0D%0AMay%0D%0AJun%0D%0AJul%0D%0AAug%0D%0ASep%0D%0AOct%0D%0ANov%0D%0ADec&yaxis1=20%0D%0A30%0D%0A80%0D%0A90%0D%0A50%0D%0A30%0D%0A60%0D%0A50%0D%0A40%0D%0A50%0D%0A10%0D%0A20&group1=Group+1&viewsource=mainView&language=en&sectionSetting=&sectionSpecific=&sectionData=",
    sink = ltn12.sink.table(r)
}
print("code=" .. tostring(code))
print("status=" .. tostring(status))
print("headers=" .. tostring(headers))
print("result=" .. tostring(result))
print("sink= " .. table.concat(r, ""))
print(result, code, headers, status)
And then …
Assuming I can eventually make the http.request work, the web page returns a .png image of the resulting graph. I'd love to be able to extract/copy that for further use within this piece of code.
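For what it's worth, here is a minimal sketch of one way this could be wired up, assuming the 301 response carries a location header pointing at the https address and that the final response body is the PNG itself; the output filename and the truncated URL are only placeholders:
local http = require "socket.http"
local https = require "ssl.https"
local ltn12 = require "ltn12"

-- Follow a single 301/302 redirect, then stream the response body straight into a file.
local function fetch_to_file(url, path)
    local probe = {}
    local _, code, headers = http.request{ url = url, sink = ltn12.sink.table(probe) }
    if code == 301 or code == 302 then
        url = headers.location -- luasocket lower-cases header names
    end
    local out = assert(io.open(path, "wb"))
    -- ltn12.sink.file writes every chunk to the handle and closes it when the body ends.
    local ok, status = https.request{ url = url, sink = ltn12.sink.file(out) }
    return ok, status
end

print(fetch_to_file("http://www.chartgo.com/create.do?...", "graph.png"))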
As always, any help/advice would be appreciated.

applescript - quicktime 7 audio (wav) export

I'm working on a script to help my workflow. My work involves sound design, and I often have to take videos and extract the audio. Regardless of the source/compression, I like it in .wav format - best quality, accepted by all audio editing software, and least overhead for playback in a live environment.
Currently, I use QuickTime Pro 7's Export feature for this task, since the current QuickTime X doesn't export to .wav. It's built into the OS, so instead of using a separate tool, I'm using QT.
I am using Automator to write a service - select the file, open it in QT, export as wav and save it in the same location as the original, then quit. Here is what I have so far, and I keep getting an error. "The action “Run AppleScript” encountered an error." It compiles properly, but nothing comes out. Late last night I got it to spit out .mov files for some reason (despite trying to tell it wave) and now I can't even get back there.
Any help is appreciated. As you can see from the commented parts, I was trying to specify anywhere, since trying to make it the same location as the original was escaping me. Currently I'm just having it prompt me on where to save, so I can just tackle one problem at a time. Cheers!
tell application "QuickTime Player 7"
--set saveFile to POSIX path of (((path to desktop) as Wave) & "test.wav")
set outfile to choose file name with prompt "Save altered file here:"
set error_states to {load state unknown, load error}
set successful_states to {loaded, complete}
repeat until load state of first document is in (error_states & successful_states)
delay 0.1
end repeat
tell document to save in outfile
if (load state of first document is in successful_states) then
if (can export first document as wave) then
export first document to outfile as wave
else
error "Cannot export " & (source_file as string) & " in .wav (Wave) format."
end if
else
error "File is not in a successful load state: " & (load state of first document as string)
end if
end tell
The output file should have a '.wav' extension. To do that, I changed your script a bit and removed the "tell document to save in outfile" line.
If you really want to delete your initial file after the export, you need to add that at the end of the script (I prefer to check that the export was successful before deleting anything; see the sketch after the script below).
Also, you must change the first line to replace the choose file (used for my tests) with the input variable from your Automator service script:
set InFile to choose file with prompt "select video file for test only" -- to be replaced by the input of your Automator Service

-- conversion of selected file: same folder, new '.wav' extension
tell application "Finder"
    set FFolder to (container of InFile) as string
    set FName to name of InFile
    set FExt to name extension of InFile
end tell
set Pos to offset of FExt in FName
if Pos > 0 then -- change extension from current to 'wav'
    set OutFile to FFolder & (text 1 thru (Pos - 1) of FName) & "wav"
else
    set OutFile to FFolder & FName & ".wav"
end if

tell application "QuickTime Player 7"
    activate
    open InFile
    set error_states to {load state unknown, load error}
    set successful_states to {loaded, complete}
    repeat until load state of first document is in (error_states & successful_states)
        delay 0.1
    end repeat
    if (load state of first document is in successful_states) then
        if (can export first document as wave) then
            export first document to OutFile as wave
        else
            error "Cannot export " & (InFile as string) & " in .wav (Wave) format."
        end if
    else
        error "File is not in a successful load state: " & (load state of first document as string)
    end if
end tell
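As for removing the original once the export has finished, here is a minimal sketch of what could be appended at the end of the script, assuming you only want the deletion to happen when the exported .wav actually exists:
-- Hypothetical addition: delete the source video only if the exported .wav file exists.
tell application "Finder"
    if exists file OutFile then
        delete InFile -- moves the original to the Trash
    end if
end tell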

Uploads not working properly NGINX + Passenger + Carrierwave + Carrierwave_backgrounder

I have a Rails 4.0.0 app set up with a model called Episode, which mounts a CarrierWave uploader called FileUploader to upload mp3s. I set the app up using carrierwave_backgrounder and Resque to background the processing of the uploaded files, which are saved to an SFTP server using the carrierwave-ftp gem. On my local machine it works great. It also works great on my VPS (CentOS 6) when I just start the app with rails s, or even rails s -e production. However, when I switch to nginx + Passenger, it no longer works as expected.
The files are uploaded to the /public/uploads/tmp dir where they are supposed to be stored temporarily, but they never get moved into the upload dir that I have specified, and none of the other post-processing gets done, like setting the content type, removing cache dirs, setting file size and length, etc.
So, yesterday, I switched from the carrierwave_backgrounder command save_in_background to process_in_background, and now it works fine for files stored locally. However, when I switch to SFTP storage using the carrierwave-ftp gem, the files get processed, i.e., they are transferred to my SFTP server and the path is stored in my model, but then the job hangs in the Resque queue.
The relevant code that is not getting executed is:
process :set_content_type
process :save_content_type_duration_and_size_in_model
Does anyone have any idea why this would work fine using development mode and even production mode but not using nginx + passenger?
Here's all the relevant code below:
episode.rb:
class Episode < ActiveRecord::Base
  require 'carrierwave/orm/activerecord'
  # require 'mp3info'

  mount_uploader :file, FileUploader
  process_in_background :file

  belongs_to :podcast
  validates :name, :podcast, :file, presence: true

  default_scope { order("created_at DESC") }
  scope :most_recent, ->(max = 5) { limit(max) }
end
file_uploader.rb:
# encoding: utf-8
class FileUploader < CarrierWave::Uploader::Base
  include CarrierWave::MimeTypes
  include ::CarrierWave::Backgrounder::Delay

  storage :sftp

  # Override the directory where uploaded files will be stored.
  # This is a sensible default for uploaders that are meant to be mounted:
  def store_dir
    "#{model.podcast.name.to_s.downcase.parameterize}"
  end

  before :store, :remember_cache_id
  after :store, :delete_tmp_dir

  # This is the relevant code that is not getting executed
  process :set_content_type
  process :save_content_type_duration_and_size_in_model

  def save_content_type_duration_and_size_in_model
    model.content_type = file.content_type if file.content_type
    model.file_size = file.size
    Mp3Info.open(model.file.current_path) do |media|
      model.duration = media.length
    end
  end

  # store! nils the cache_id after it finishes, so we need to remember it for deletion
  def remember_cache_id(new_file)
    @cache_id_was = cache_id
  end

  def delete_tmp_dir(new_file)
    # make sure we don't delete other things accidentally by checking the name pattern
    if @cache_id_was.present? && @cache_id_was =~ /\A[\d]{8}\-[\d]{4}\-[\d]+\-[\d]{4}\z/
      FileUtils.rm_rf(File.join(root, cache_dir, @cache_id_was))
    end
  end
end
config/initializers/carrierwave_backgrounder.rb:
CarrierWave::Backgrounder.configure do |c|
  c.backend :resque, queue: :carrierwave
end
config/initializers/carrierwave.rb:
CarrierWave.configure do |config|
  config.sftp_host = "ftphost.com"
  config.sftp_user = "ftp_user"
  config.sftp_folder = "ftp_password"
  config.sftp_url = "http://url.com"
  config.sftp_options = {
    :password => "ftp_password",
    :port => 22
  }
end
I'm starting Resque with the command: QUEUE=* bundle exec rake environment resque:work &
If you need more info, just ask. Any help would be greatly appreciated.
UPDATE: Well, oddly enough as is often the case, it is now magically working. Not sure what did the trick, so I'm afraid this won't be of any help to anyone else who stumbles on this page.
I have the same issue. My process blocks run in development (rails s) but not under apache2/Passenger. It's not pretty, but the way I solved it was to move my process code into the after :cache callback. The process blocks are called between the before :cache and after :cache callbacks, so this seemed reasonable to me.
Here's the super weird part: I don't mean call the functions; I mean copy the code out of your process blocks (or functions) and paste it directly into your after_cache callback.
I know I'm doing something wrong to cause this situation, but I cannot figure it out. Hope this helps you.
version :office_preview do
  # comment out the following since it does nothing under Passenger
  # process :office_to_img
end

def office_to_img
  # this won't be called under Passenger :(
end

after :cache, :after_cache

def after_cache(file)
  # for some reason, calling office_to_img here doesn't do anything either
  # office_to_img
  # code copied & pasted here from office_to_img
end

How do you transfer a binary file via Connect:Direct NDM?

I'm trying to submit a binary file, in this case an Excel file, from my local server (a Solaris server with mainframe rehosting software) to a destination server (a mainframe) using Connect:Direct NDM.
Here are the environment values I set:
SODETFL "DetailedReport.xls"
SODDETNDM "FIN.REPORT(+1)"
TDCOPTS ":DATATYPE=BINARY:XLATE=NO:STRIP.BLANKS=NO"
Here is the NDM configuration I use:
ASSGNDD ddname='SYSIN' type='INSTREAM' << !
SIGNON 00260005
SUBMIT PROC=COPYFILE - 00270005
JOBNAME=JOB00001 - 00280005
PNODE=SERVER001 - 00290005
SNODE=NDMIDS - 00300005
SNODEID=(xxxxxx,xxxxxx) - 00310005
HOLD=NO - 00320005
NOTIFY=CCACTD - 00330005
NODE=, - 00360005
DSN1=${SODDETFL} - 00370005
DSN2=${SODDETNDM} -
DCBINFO='dcb=(dsorg=ps, recfm=vb, lrecl=1504)' - 00385005
DISP1=NEW, - 00390005
DISP2=CATLG,DELETE - 00400005
UNIT=BATCH - 00410005
SYSOPTS=${TDCOPTS} - 00440005
AEFAJOB=PSIAPNB5
SEL PROC WHERE (QUEUE=A) TABLE 00450005
SIGNOFF 00460005
I'm able to send text files via NDM all day long, no problems there. However, it seems that binary is a bit more difficult. When I try with the above configuration, I get the following error:
Completion Code => 8
Message Id => XCPS009I
Short Text => Read buffer too small. Possibly src reclen > dest reclen.
Ckpt=>Y Lkfl=>N Rstr=>N Xlat=>Y Scmp=>N Ecmp=>Y Ecpr=>0.00 CRC=>N Zlvl=>1 win=>13 Zmem=>4
Can anyone shed some light as to how I can go about submitting a binary file via NDM?
Off the cuff...
Try changing RECFM=VB to RECFM=U and specify a BLKSIZE= instead of a LRECL=
This is really not all that different from how executable load modules are stored on the mainframe, except you don't want the file to be a PDS dataset. I'm not at my office right now, and I think I have some examples of NDM jobs that transmit load modules that I can look up if this suggestion doesn't work, but I think it will.
Give this suggestion a shot, and if it still doesn't fly, let me know.
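A minimal sketch of what that change might look like against the configuration above; the BLKSIZE value is only an illustration, not a recommendation:
DCBINFO='dcb=(dsorg=ps, recfm=u, blksize=27998)' -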
