How to write `cat` in Go using pipes

I have an implementation of the Unix tool cat below. It reads a number of bytes from os.Stdin into a buffer, then writes those bytes out to os.Stdout. Is there a way I can skip the buffer and just pipe Stdin directly to Stdout?
package main

import (
    "io"
    "os"
)

func main() {
    buf := make([]byte, 1024)
    var n int
    var err error
    for err != io.EOF {
        n, err = os.Stdin.Read(buf)
        if n > 0 {
            os.Stdout.Write(buf[0:n])
        }
    }
}

You can use io.Copy() (documented at https://pkg.go.dev/io#Copy).
Example:
package main

import (
    "io"
    "log"
    "os"
)

func main() {
    if _, err := io.Copy(os.Stdout, os.Stdin); err != nil {
        log.Fatal(err)
    }
}
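Note that io.Copy does not eliminate buffering, it just hides it: unless one of the two files can take over the copy itself (os.File implements io.ReaderFrom), it allocates an internal buffer (32 KB in current Go versions) and runs essentially the same read/write loop as your code. If you want to keep control over the buffer size, io.CopyBuffer lets you supply your own. A minimal sketch, reusing the 1024-byte buffer from the question:

package main

import (
    "io"
    "log"
    "os"
)

func main() {
    // Supply our own buffer; io.CopyBuffer drives the read/write
    // loop and handles io.EOF for us.
    buf := make([]byte, 1024)
    if _, err := io.CopyBuffer(os.Stdout, os.Stdin, buf); err != nil {
        log.Fatal(err)
    }
}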


Related

Get Request in Golang

This is a textbook example I'm trying to put to use.
I get "BAD" as a result, which means that resp is nil, but I don't know how to fix it.
package main

import (
    "fmt"
    "io/ioutil"
    "net/http"
)

func main() {
    resp, _ := http.Get("http://example.com/")
    if resp != nil {
        body, _ := ioutil.ReadAll(resp.Body)
        fmt.Println(string(body))
        resp.Body.Close()
    } else {
        fmt.Println("BAD")
    }
}
I would recommend checking your Internet settings first, as I cannot reproduce the problem.
Also, error handling is crucial in Go, so change your code to the version below and see whether the request returns an error.
package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
)

func main() {
    resp, err := http.Get("http://example.com/")
    if err != nil {
        log.Fatalln(err)
    }
    if resp != nil {
        body, err := ioutil.ReadAll(resp.Body)
        if err != nil {
            log.Fatalln(err)
        }
        fmt.Println(string(body))
        resp.Body.Close()
    } else {
        fmt.Println("BAD")
    }
}
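Once the error is handled, the resp != nil check becomes redundant, because http.Get guarantees a non-nil response whenever it returns a nil error. A slightly more idiomatic shape of the same program closes the body via defer and checks the status code (the status check is my addition, not part of the original question):

package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
)

func main() {
    resp, err := http.Get("http://example.com/")
    if err != nil {
        log.Fatalln(err)
    }
    // Close the body even if a later step fails.
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        log.Fatalf("unexpected status: %s", resp.Status)
    }

    body, err := ioutil.ReadAll(resp.Body)
    if err != nil {
        log.Fatalln(err)
    }
    fmt.Println(string(body))
}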

Function that returns Reader to http.Response

This is a stripped-down version of the code I want to use for a page-specific web crawler. The idea is to have a function that takes a URL, deals with the HTTP request, and returns a Reader for the body of the http.Response:
package main

import (
    "io"
    "log"
    "net/http"
    "os"
)

func main() {
    const url = "https://xkcd.com/"
    r, err := getPageContent(url)
    if err != nil {
        log.Fatal(err)
    }
    f, err := os.Create("out.html")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    io.Copy(f, r)
}

func getPageContent(url string) (io.Reader, error) {
    res, err := http.Get(url)
    if err != nil {
        return nil, err
    }
    return res.Body, nil
}
The response body is never closed, which is bad. Closing it inside the getPageContent function won't work, of course, because io.Copy would then have nothing left to read from a closed resource.
My question is of general interest rather than about this specific use case: how can I use functions to abstract the fetching of external resources without having to store the whole resource in a temporary buffer? Or should I avoid such abstractions altogether?
As the user leaf bebop pointed out in the comment section, the function getPageContent should return an io.ReadCloser instead of just an io.Reader:
package main

import (
    "io"
    "log"
    "net/http"
    "os"
)

func main() {
    const url = "https://xkcd.com/"
    r, err := getPageContent(url)
    if err != nil {
        log.Fatal(err)
    }
    defer r.Close()
    f, err := os.Create("out.html")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    io.Copy(f, r)
}

func getPageContent(url string) (io.ReadCloser, error) {
    res, err := http.Get(url)
    if err != nil {
        return nil, err
    }
    return res.Body, nil
}
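Returning an io.ReadCloser makes the transfer of ownership explicit: the caller receives the body and is now responsible for closing it, which is why main can meaningfully say defer r.Close().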
Another solution is to return the response itself and close it in the main function. That also gives you a place to check the response StatusCode and similar fields if new requirements come up. Here is the updated code:
package main

import (
    "io"
    "log"
    "net/http"
    "os"
)

func main() {
    const url = "https://xkcd.com/"
    r, err := getPageContent(url)
    if err != nil {
        log.Fatal(err)
    }
    defer r.Body.Close()
    if r.StatusCode != http.StatusOK {
        // some operations
    }
    f, err := os.Create("out.html")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    io.Copy(f, r.Body)
}

func getPageContent(url string) (*http.Response, error) {
    res, err := http.Get(url)
    if err != nil {
        return nil, err
    }
    return res, nil
}
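A further pattern, not used in the answers above but common in Go code, is to invert control: the function keeps ownership of the body and lends the caller a reader only for the duration of a callback, so the Close can never be forgotten. The helper name withPageContent is my own invention; this is a sketch of the idea, not an established API:

package main

import (
    "io"
    "log"
    "net/http"
    "os"
)

// withPageContent fetches url, passes the response body to fn,
// and guarantees the body is closed before returning.
func withPageContent(url string, fn func(io.Reader) error) error {
    res, err := http.Get(url)
    if err != nil {
        return err
    }
    defer res.Body.Close()
    return fn(res.Body)
}

func main() {
    f, err := os.Create("out.html")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    err = withPageContent("https://xkcd.com/", func(r io.Reader) error {
        _, err := io.Copy(f, r)
        return err
    })
    if err != nil {
        log.Fatal(err)
    }
}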

Umlauts in ISO-8859-1 encoded website

My very simple code snippet:
import "net/http"
import "io"
import "os"
func main() {
resp, err := http.Get("http://example.com")
if err == nil {
io.Copy(os.Stdout, resp.Body)
}
}
When example.com is encoded as charset=iso-8859-1, my output is faulty. Umlauts, for example, are not displayed correctly:
Hällo Wörld --> H?llo W?rld
What's a good solution for displaying umlauts correctly?
You can use the package golang.org/x/net/html/charset to determine the encoding of the website, and also create a reader that converts the content to UTF-8.
Below is a working example:
package main

import (
    "io"
    "net/http"
    "os"

    "golang.org/x/net/html/charset"
)

func main() {
    resp, err := http.Get("http://example.com")
    if err != nil {
        os.Exit(1)
    }
    // Make sure the body is closed when we're done.
    defer resp.Body.Close()
    r, err := charset.NewReader(resp.Body, resp.Header.Get("Content-Type"))
    if err != nil {
        os.Exit(1)
    }
    io.Copy(os.Stdout, r)
}
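If the server does not send a usable charset in the Content-Type header, charset.NewReader falls back to sniffing the first bytes of the body. When you already know the encoding, you can force it with charset.NewReaderLabel instead; a short sketch, assuming the page really is ISO-8859-1:

package main

import (
    "io"
    "log"
    "net/http"
    "os"

    "golang.org/x/net/html/charset"
)

func main() {
    resp, err := http.Get("http://example.com")
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()
    // Force ISO-8859-1 rather than trusting the Content-Type header.
    r, err := charset.NewReaderLabel("iso-8859-1", resp.Body)
    if err != nil {
        log.Fatal(err)
    }
    io.Copy(os.Stdout, r)
}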

Limiting bandwidth of http get

I'm a beginner at Go.
Is there any way to limit the bandwidth usage of Go's http.Get()? I found this: http://godoc.org/code.google.com/p/mxk/go1/flowcontrol, but I'm not sure how to piece the two together. How would I get access to the HTTP reader?
Third-party packages have convenient wrappers, but if you're interested in how things work under the hood, it's quite easy.
package main

import (
    "io"
    "log"
    "net/http"
    "os"
    "time"
)

var datachunk int64 = 500       // bytes per tick
var timelapse time.Duration = 1 // seconds between ticks

func main() {
    response, err := http.Get("http://google.com")
    if err != nil {
        log.Fatal(err)
    }
    defer response.Body.Close()
    // Copy at most datachunk bytes per tick; io.CopyN returns io.EOF
    // once the body is exhausted, which ends the loop.
    for range time.Tick(timelapse * time.Second) {
        if _, err := io.CopyN(os.Stdout, response.Body, datachunk); err != nil {
            break
        }
    }
}
Nothing magic.
There is an updated version of the package on GitHub: github.com/mxk/go-flowrate.
You use it by wrapping an io.Reader.
Here is a complete example which will show the homepage of Google veeeery sloooowly.
This wrapping an interface to make new functionality is very good Go style, and you'll see a lot of it in your journey into Go.
package main

import (
    "io"
    "log"
    "net/http"
    "os"

    "github.com/mxk/go-flowrate/flowrate"
)

func main() {
    resp, err := http.Get("http://google.com")
    if err != nil {
        log.Fatalf("Get failed: %v", err)
    }
    defer resp.Body.Close()

    // Limit to 10 bytes per second
    wrappedIn := flowrate.NewReader(resp.Body, 10)

    // Copy to stdout
    _, err = io.Copy(os.Stdout, wrappedIn)
    if err != nil {
        log.Fatalf("Copy failed: %v", err)
    }
}
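To see what such a wrapper does under the hood, here is a minimal hand-rolled version. The throttledReader type is my own illustration, not part of any package, and its pacing is much cruder than what flowrate implements:

package main

import (
    "io"
    "log"
    "net/http"
    "os"
    "time"
)

// throttledReader wraps an io.Reader and sleeps after each read so
// that roughly bytesPerSec bytes are delivered per second.
type throttledReader struct {
    r           io.Reader
    bytesPerSec int
}

func (t *throttledReader) Read(p []byte) (int, error) {
    // Never read more than one second's budget in a single call.
    if len(p) > t.bytesPerSec {
        p = p[:t.bytesPerSec]
    }
    n, err := t.r.Read(p)
    // Sleep in proportion to the number of bytes just delivered.
    time.Sleep(time.Second * time.Duration(n) / time.Duration(t.bytesPerSec))
    return n, err
}

func main() {
    resp, err := http.Get("http://google.com")
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()
    // Roughly 100 bytes per second.
    io.Copy(os.Stdout, &throttledReader{r: resp.Body, bytesPerSec: 100})
}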

Golang Error: Undefined: http.NewSingleHostReverseProxy

I tried building a simple program in Go. Here it is:
package main

import (
    "net/http"
    "net/http/httputil"
    "log"
)

func main() {
    proxy := http.NewSingleHostReverseProxy(&http.URL{Scheme: "http", Host: "www.google.com", Path: "/"})
    err := http.ListenAndServe(":8080", proxy)
    if err != nil {
        log.Fatal("ListenAndServe: ", err.String())
    }
}
Build:
go build myprogram.go
Output:
command-line-arguments
./myprogram.go:5: imported and not used: "net/http/httputil"
./myprogram.go:11: undefined: http.NewSingleHostReverseProxy
./myprogram.go:11: undefined: http.URL
./myprogram.go:15: err.String undefined (type error has no field or method String)
I noticed that http.NewSingleHostReverseProxy is in the "net/http/httputil" package, so why did I see such errors?
Maybe I need a specific command to build it correctly?
EDIT
Afterwards, here is the new working code:
package main

import (
    "net/http"
    "net/http/httputil"
    "net/url"
    "log"
)

func main() {
    proxy := httputil.NewSingleHostReverseProxy(&url.URL{Scheme: "http", Host: "www.google.com", Path: "/"})
    err := http.ListenAndServe(":8080", proxy)
    if err != nil {
        log.Fatal("ListenAndServe: ", err)
    }
}
I added the "net/url" import, replaced http.NewSingleHostReverseProxy with httputil.NewSingleHostReverseProxy, and replaced http.URL with url.URL. I also dropped the err.String() call, since Go's error type has no String method; log.Fatal can print the error directly.
Thanks to raina77ow for help.
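Once it compiles, running the program and pointing a browser (or curl) at http://localhost:8080 should serve Google's homepage through the proxy, since the reverse proxy rewrites each incoming request to target the URL given to NewSingleHostReverseProxy.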
