What Content-Transfer-Encoding should I use when sending a PHP mail() containing a long http-link?

I've got a script that sends e-mails that looks kinda like this:
$headers = "From: test#example.com\r\n";
$headers .= "Reply-To: test#example.com\r\n";
$headers .= "MIME-Version: 1.0\r\n";
$headers .= "Content-type: text/plain; charset=utf-8\r\n";
$headers .= "Content-Transfer-Encoding: 8bit";
$orgsubject = "A subject with some swedish characters like å, ä and ö";
$newsubject='=?UTF-8?B?'.base64_encode($orgsubject).'?=';
$body = 'A lot of text.
Some more text.
A long URL:
http://example.com/subpage/?id=1234&hash=23jh4lk2j3h4lkjh674598xzxlk2j34h&anotherhash=h2k3j4h23kh42kj34h2lk3';
It's tested thoroughly, but some users, I think Outlook users, get a URL looking like this:
http://example.com/subpage/?id=3D1234&hash=3D23jh4lk2j3h4lkjh674598xzxlk2j34h&anotherhash=3Dh2k3j4h23kh42kj34h2lk3
Each equal sign is now followed by '3D', which makes the URL useless in my case. I guess this has something to do with the Content-Transfer-Encoding, or perhaps the Content-Type. Do I need to encode the body message as base64 or something?
Update
Just found this answer: https://stackoverflow.com/a/7289434/513321
So I removed the Content-Transfer-Encoding header and it seems to work fine. On the other hand, I could never reproduce the error where the URL contained the '3D' text, so I can't be certain that this will work. Does anyone know whether removing the Content-Transfer-Encoding header would solve my problem?

There's no mention of this in your question, but =3D is an escape sequence of the quoted-printable transfer encoding; it's used to safely transfer 8-bit data over channels that are only guaranteed to handle 7-bit data.
There are still mail servers in existence that don't like lines longer than 76 columns, and intermediate servers may mangle your message without updating the message headers, resulting in the behaviour you observed.
Since PHP 5.3.0, you can use quoted_printable_encode() to encode your message and set the Content-Transfer-Encoding header accordingly:
Content-Transfer-Encoding: quoted-printable
Setting an explicit transfer encoding is a good practice you should adopt rather than a naive “somehow it works” approach :)
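For concreteness, here is a minimal sketch of the quoted-printable variant; the addresses and message text are placeholders, not taken from the question:

$headers  = "From: test@example.com\r\n";
$headers .= "MIME-Version: 1.0\r\n";
$headers .= "Content-Type: text/plain; charset=utf-8\r\n";
$headers .= "Content-Transfer-Encoding: quoted-printable\r\n";
$subject = '=?UTF-8?B?' . base64_encode('A subject with å, ä and ö') . '?=';
$body = "A long URL:\r\nhttp://example.com/subpage/?id=1234&hash=23jh4lk2j3h4lkjh674598xzxlk2j34h";
// quoted_printable_encode() (PHP >= 5.3) escapes '=' as '=3D' and soft-wraps
// long lines at 76 columns with a trailing '=', so no intermediate server has
// a reason to re-wrap them, and a decoding client restores the URL exactly.
mail('recipient@example.com', $subject, quoted_printable_encode($body), $headers);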

Related

Trying to pass a JSON string with an em dash from Tcl to browser and it fails to parse?

I've been going in circles a bit for the past two days with this simple task, due to lack of knowledge in a couple of areas; so this may sound like a repeat of a question from yesterday, but it is not.
I had a string of plain text stored in SQLite containing an em dash represented by the three bytes 226 128 148. I was trying to search the string for the dash by typing it in the text editor, and could not locate it.
After much help on SO, I learned that the plain text could easily be converted to UTF-8; I did that and updated the SQLite table column. The dash now appears as code 8212 if I run scan $c %c.
Now the dash is displayed even in the command-line interface when I run SQLite from there for testing.
However, I can no longer send the record to a web browser to be displayed, because of the following error: SyntaxError: JSON.parse: bad control character in string literal at line 1 column 49 of the JSON data localhost:8000:233:25. Of course, column 49 is the dash.
If I send back the previous version, from before updating the string from plain text to UTF-8, there is no error in the browser and the string is displayed as expected.
Likely this is a very stupid question, but I don't know what I'm doing wrong or not doing. Thank you for any guidance you may be able to provide.
I updated one record using a SQLite function.
dbt function decodeUTF -argcount 1 -deterministic -directonly -returntype text { encoding convertfrom utf-8 }
set sql {select decodeUTF(text_content) from tablename where conditions...}
dbt eval $sql
I then retrieved the new value and sent it to a browser using some Tcl code acting as a very limited local server.
proc GetSQL {sock} {
    chan flush $sock
    set sql {select text_content from tablename where conditions ... }
    dbt eval $sql {
        set result "{\"result\":\"$text_content\"}"
    }
    set headers ""
    append headers "HTTP/1.1 200 OK\n"
    append headers "Content-Type: application/json; charset: utf-8\n"
    append headers "Content-Length: [string length $result]\n"
    append headers "Connection: Keep-Alive\n"
    puts $sock $headers
    chan configure $sock -translation binary
    puts $sock $result
}
The GET request is made using fetch from the browser page script, and return response.json() is where I think it fails.
The encoding convertfrom utf-8 done when storing (your decodeUTF) needs to be matched by an encoding convertto utf-8 $result upon retrieval, before the result is written to the client over your binary-only channel:
puts $sock [encoding convertto utf-8 $result]
This would also explain the exact error you see: the em dash is code point 8212 (hex 2014), and when a character above 255 is pushed through a binary channel unconverted, only its low byte survives; here that is 0x14, a control character, which is precisely what JSON.parse is complaining about.

wp_mail() transforms "-" to "–" in email subject

I am using wp_mail() to send emails from a custom WordPress plugin.
I am trying to figure out why some non-alphanumeric characters in the subject of the sent emails get changed. For example, a subject such as "Word1 - Word2" is received as "Word1 – Word2", which doesn't look good at all.
The code looks like this:
$subject = 'word1 - word2';
$msg = 'message';
$headers = 'Content-Type: text/html; charset=utf-8';
wp_mail('a@b.com', $subject, $msg, $headers);
The email subject shows "Word1 – Word2" in Gmail. I know it has to do with encoding, but does anyone know how to fix this?
Thanks!
The character in question: UTF-8 bytes E2 80 93 = code point 8211 (x2013) = [–] EN DASH.
&ndash; is an "HTML entity". There is a whole set of these that let you encode any fancy character for web pages using only plain ASCII characters.
It is also the Unicode "code point" 8211 (decimal) or 2013 (hex). And it can be encoded in most places using the three UTF-8 bytes hex E2 80 93.
The sender had some way of encoding an EN DASH instead of a plain dash -.
Quite possibly wp_mail deliberately encoded any non-ASCII characters in order to avoid strange things happening if the text were to be rendered on a web page.
On any web page, &ndash; will render as –.
"Edit" my answer to see that that is exactly what I did. (Note also that a backtick on this forum inhibits the rendering.)

how to send a multi-part POST with curl without knowing total size of input

I am working on a project that involves sending voice over an HTTP stream; I am currently using cURL for my HTTP backend. I see that if I need to use "Transfer-Encoding: chunked", I have to mention the total stream size / "Content-Length:". I am currently waiting for the stream to complete, at which point I know the total content size. That works, but it causes significant delay. I would like to know how I can upload the data in chunks without knowing the total content length of the input.
curl_formadd(&formpost, &lastptr, CURLFORM_COPYNAME, "audio",
CURLFORM_CONTENTTYPE, MULTI_PART_CONTENT_TYPE_AUDIO,
CURLFORM_STREAM, &(*(aBufffer)),
CURLFORM_CONTENTSLENGTH,bufferSize,
CURLFORM_END);
The documentation for CURLFORM_STREAM specifies that it is mandatory to also set CURLFORM_CONTENTSLENGTH. I need to use CURLFORM_STREAM because my buffer is big and I want curl to call CURLOPT_READFUNCTION to post the remaining data.
Looking at the HTTP specification for the Content-Length request header and for the message body indicates that this header can be omitted for a POST request as long as the Transfer-Encoding header is specified.
The server will look for a message body if either one of these headers is present.
The problem is that you would have to find a way to figure out, on the server side, whether the message has been fully received.
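For what it's worth, chunked transfer encoding answers that completeness question by itself: each chunk is prefixed with its length in hex, and a zero-length chunk terminates the body, so the server knows when the upload is done without any Content-Length. Schematically (the URL, boundary, and sizes are invented for illustration):

POST /upload HTTP/1.1
Host: example.com
Transfer-Encoding: chunked
Content-Type: multipart/form-data; boundary=----xyz

1000
...first 0x1000 bytes of the form data...
1000
...next 0x1000 bytes...
0

Each chunk is "size-in-hex CRLF data CRLF"; the final 0 chunk followed by an empty line marks the end of the message.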
Tell libcurl to do the POST using chunked encoding by setting the header yourself; see the example below. You can then simply lie and set CURLFORM_CONTENTSLENGTH to some non-zero value, since libcurl won't pass on a Content-Length: header in its request anyway.
struct curl_slist *headerlist = curl_slist_append(NULL, "Transfer-Encoding: chunked");
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
/* pass in the created formpost data */
curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);
/* send the entire thing away */
curl_easy_perform(curl);

Attachment in html formatted mail in unix

1. (cat mytest.html;uuencode "myfile.xls" "myfile.xls")|mail -s "$("This is Subject\nContent-Type: text/html")" test@yahoo.com
2. (uuencode "myfile.xls" "myfile.xls")|mail -s "$("This is Subject\nContent-Type: text/html")" test@yahoo.com < mytest.html
When I use the above two methods, the output comes through HTML-formatted, but I am not getting any attachment (mytest.html contains the HTML part).
Note: I am getting some scattered characters in place of the attachment.
Please get me out of here.
uuencode was an old standard for encoding binary data as ASCII text for inclusion in mail and news articles, but it has been obsolete and out of common use for more than a decade. There are probably no remaining MUAs that still know how to process it, especially in HTML mail.
Also, your trick of sneaking the Content-Type header into the -s argument of the mail command is a very ugly hack. I'm surprised it works at all! In any case, it fails to include at least one other required header: MIME-Version: 1.0.
You need to build a MIME multipart message with one part being your HTML document and the other part being your attachment (probably base64-encoded, since it's binary data).
Because MIME requires you to choose a multipart boundary, format the body of the mail to delimit the parts using that boundary, generate headers for each subpart (including each part's own Content-Type, and possibly Content-Transfer-Encoding, Content-Disposition, and others), and encode each part appropriately, you are much better off using a toolkit that constructs MIME messages for you than trying to do it manually through the mail command (see the skeleton below). If you are working in the shell you might try makemime, but that is almost as ugly as doing it manually, so I'd suggest something like Perl's MIME-Tools.
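For reference, the message such a toolkit builds looks roughly like this; the boundary string is invented for illustration:

Subject: This is Subject
MIME-Version: 1.0
Content-Type: multipart/mixed; boundary="BOUNDARY-1234"

--BOUNDARY-1234
Content-Type: text/html; charset=utf-8

...contents of mytest.html...

--BOUNDARY-1234
Content-Type: application/vnd.ms-excel; name="myfile.xls"
Content-Transfer-Encoding: base64
Content-Disposition: attachment; filename="myfile.xls"

...base64-encoded contents of myfile.xls...

--BOUNDARY-1234--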

failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request

I am accessing images from another website, and I get the error "failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request" when copying some (not all) images. Here is my code:
$img = $_GET['img']; // another website's URL
$file = $img;
function getFileextension($file) {
    // end() expects a variable, so keep the exploded parts in one first
    $parts = explode(".", $file);
    return end($parts);
}
$fileext = getFileextension($file);
if ($fileext == 'jpg' || $fileext == 'gif' || $fileext == 'jpeg' || $fileext == 'png' || $fileext == 'x-png' || $fileext == 'pjpeg') {
    if ($img != '') {
        $rand_variable1 = rand(10000, 100000);
        $node_online_name1 = $rand_variable1 . "image." . $fileext;
        $s = copy($img, "images/" . $node_online_name1);
    }
}
I think preg_replace() makes more sense here, as it works with the latest versions of PHP; ereg_replace() didn't work for me, being deprecated:
$url = preg_replace("/ /", "%20", $url);
I had the same problem, but it was solved by
$url = str_replace(" ", "%20", $url);
Thanks Cello_Guy for the post.
The only issue I can think of is spaces in the URL, most likely in the file name. All spaces in a URL need to be converted to their proper encoding, which is %20.
If you have a file name like this:
"http://www.somewhere.com/images/img 1.jpg"
you would get the above error, but with this:
"http://www.somewhere.com/images/img%201.jpg"
you should have no problems.
Just use str_replace() to replace the spaces (" ") with their proper encoding ("%20").
It looks like this:
$url = str_replace(" ", "%20", $url);
For more information on str_replace(), check out the PHP manual.
Use the function rawurlencode().
It encodes the given string according to RFC 3986.
http://php.net/manual/ru/function.rawurlencode.php
Even a trailing blank in the URL can cause PHP's file($url) to fail. In recent versions of PHP or Apache, a trailing blank in the URL will cause the error, while the same URL appears to work in a browser because the browser knows enough to %20-encode the trailing blank or ignore it. That was my error, anyway.
Older LAMP stacks allowed it (i.e. the same code ran OK). Easy fix.
It seems that the URL has some spaces or other special characters; you need to encode it, using the urlencode() function.
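One caveat that ties these answers together (my note, not from the thread): rawurlencode() and urlencode() applied to a complete URL will also escape the scheme and the slashes, which breaks the URL. Encode only the part that needs it, for example the file name; the URL below is invented for illustration:

$url = "http://www.somewhere.com/images/img 1.jpg";

// Encoding the entire URL escapes ':' and '/' too, producing garbage:
echo rawurlencode($url), "\n";
// http%3A%2F%2Fwww.somewhere.com%2Fimages%2Fimg%201.jpg

// Safer: encode only the file-name portion, then reassemble.
$dir  = dirname($url);                 // http://www.somewhere.com/images
$name = rawurlencode(basename($url));  // img%201.jpg
echo $dir . '/' . $name, "\n";
// http://www.somewhere.com/images/img%201.jpg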
