I am writing a REST service using go-json-rest, which in turn uses net/http.
My server code is simple: get the request and pass it to a channel.
Here is my server code:
package main
import (
"github.com/ant0ine/go-json-rest/rest"
"log"
"net/http"
"strconv"
"time"
)
const workerCount = 4
var evChannel = make(chan Event)
var workers = make([]*LogWorker, workerCount)
const maxLogFileSize = 100 // In MB
const maxLogFileBackups = 30
const maxLogFileAge = 5
const logFileName = "/home/sam/tmp/go_logs/event_"
func main() {
// Initialize workers
// Four workers are created
for i := 0; i < workerCount; i++ {
var fileName = logFileName + strconv.Itoa(i)
workers[i] = NewLogWorker(fileName, maxLogFileSize, maxLogFileBackups, maxLogFileAge)
go workers[i].Work(evChannel)
}
// Initialize REST API
api := rest.NewApi()
//api.Use(rest.DefaultDevStack...)
api.Use(rest.DefaultCommonStack...)
router, err := rest.MakeRouter(
rest.Post("/events", StoreEvents),
)
if err != nil {
log.Fatal(err)
}
api.SetApp(router)
log.Fatal(http.ListenAndServe(":4545", api.MakeHandler()))
}
func StoreEvents(w rest.ResponseWriter, r *rest.Request) {
event := Event{}
err := r.DecodeJsonPayload(&event)
if err != nil {
rest.Error(w, err.Error(), http.StatusInternalServerError)
return
}
// TODO : Add validation if needed
// Add code to parse the request and add further information to event
// log.Println()
select {
case evChannel <- event:
case <- time.After(5 * time.Second):
// throw away the message, so sad
}
// evChannel <- event
//log.Println(Csv(event))
w.WriteHeader(http.StatusOK)
}
When I run it continuously under JMeter, I occasionally get the error below:
http: Accept error: accept tcp [::]:4545: too many open files; retrying in 10ms
Does net/http open files for every request?
Posting elithrar's comment as an answer:
Sockets, yes. You may need to increase your fd limit (via ulimit or sysctl).
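Each accepted connection consumes a socket, i.e. a file descriptor, so under sustained load the per-process limit can be exhausted. As a quick way to inspect (and, up to the hard limit, raise) that limit from Go itself, here is a minimal sketch using the syscall package; it assumes Linux, and whether raising the limit is permitted depends on your system configuration:
package main

import (
	"log"
	"syscall"
)

func main() {
	// Read the current soft/hard limits on open file descriptors.
	var rl syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
		log.Fatal(err)
	}
	log.Printf("soft=%d hard=%d", rl.Cur, rl.Max)

	// Raise the soft limit to the hard limit (going beyond it needs privileges).
	rl.Cur = rl.Max
	if err := syscall.Setrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
		log.Fatal(err)
	}
}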
I'm not able to get all the keys from the Header field of the request in Gin's Context (Golang's gin-gonic http/rest framework), even though Header is defined as a map, "type Header map[string][]string" (Header is from net/http/header.go, Request is from net/http/request.go, and Context is from Gin-gonic's package). Surprisingly, even at compile/build time, Visual Studio Code doesn't let me call the "MapKeys()" method on this Header, which is of map type (given that Golang is a statically typed language and already knows the data type at compile time).
I need to copy all the HTTP headers into the Logger so that when I log any message, I can include the corresponding request headers.
I also need to pass all the HTTP headers from the HTTP layer to the gRPC calls for end-to-end call traceability.
func (l *Logger) InfoCtx(ctx *gin.Context, md metadata.MD) *zerolog.Event {
headerName := "X-Request-Id" // Read all the headers from the ENV file
// mapping := make(map[string]string)
// mapping[headerName] = ctx.Request.Header[headerName][0]
event := l.Logger.Info()
// ctx.Request.Header ==> Even though this is a "map" type,
// which is known at the compilation time itself,
// it doesn't let me use any map functions.
if ctx != nil && len(ctx.Request.Header[headerName]) > 0 {
event = event.Str(headerName, ctx.Request.Header[headerName][0])
} else if md != nil {
// some other gRPC metadata context handling (not relevant for this question)
}
return event
}
Could you please help?
For reference:
Header object
Request object uses the Header field
Shows Header is of map type
I may be misunderstanding your issue but I'm able to enumerate the map of request headers from Gin's context:
go.mod:
module github.com/OWNER/stackoverflow/69315290
go 1.16
require github.com/gin-gonic/gin v1.7.4
And main.go:
package main
import (
"log"
"github.com/gin-gonic/gin"
)
func main() {
r := gin.Default()
r.GET("/ping", func(c *gin.Context) {
for k, v := range c.Request.Header {
log.Printf("%s: %v", k, v)
}
c.JSON(200, gin.H{
"message": "pong",
})
})
r.Run()
}
And:
curl --header "dog: freddie" localhost:8080/ping
Yields:
{"message":"pong"}
And:
2021/09/25 10:41:05 User-Agent: [curl/7.68.0]
2021/09/25 10:41:05 Accept: [*/*]
2021/09/25 10:41:05 Dog: [freddie]
[GIN] 2021/09/25 - 10:41:05 | 200 | 408.631µs | 127.0.0.1 | GET "/ping"
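If the underlying goal is to copy every incoming header into the log event, a minimal sketch along the lines of the question's InfoCtx function could look like the following (this assumes the rs/zerolog API and logs only the first value of each header; InfoCtxAllHeaders is just an illustrative name):
func (l *Logger) InfoCtxAllHeaders(ctx *gin.Context) *zerolog.Event {
	event := l.Logger.Info()
	if ctx != nil {
		// http.Header is a map[string][]string, so it can be ranged over directly.
		for name, values := range ctx.Request.Header {
			if len(values) > 0 {
				event = event.Str(name, values[0])
			}
		}
	}
	return event
}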
Another approach that worked for me in the meantime:
I created a struct with an "HttpHeadersMap map[string][]string" field in it:
type CommonContext struct {
HttpHeadersMap map[string][]string
RequestContext context.Context
GrpcMDHeadersMap map[string][]string
}
Then I assigned Gin's "ctx.Request.Header" to the "HttpHeadersMap" field:
func GetCommonCtx(ctx *gin.Context, md metadata.MD) CommonContext {
var commonContext CommonContext
if ctx != nil {
// event = event.Str(headerName, ctx.Request.Header[headerName][0])
commonContext = CommonContext{ // don't return an address, use the value type
HttpHeadersMap: ctx.Request.Header,
RequestContext: ctx.Request.Context(),
}
}
...
}
Then, inside the gRPC interceptor (shown here just as an example use case), I could use "HttpHeadersMap" in the regular way, iterating over the map keys with "headersMapVal.MapKeys()":
func clientInterceptor(
ctx context.Context,
method string,
req interface{},
reply interface{},
cc *grpc.ClientConn,
invoker grpc.UnaryInvoker,
opts ...grpc.CallOption,
) error {
start := time.Now()
commonCtx := commonContext.GetCommonCtx(nil, metadata.MD{})
if callOpt, ok := opts[0].(CustomDataCallOption); ok {
headersMapVal := reflect.ValueOf(callOpt).FieldByName("HeadersMap")
newMap := make(map[string]string)
// allKeysMap := make(map[string]string)
for _, key := range headersMapVal.MapKeys() {
// fmt.Printf("headersMapVal.MapKeys(), e %v", e)
// c_key := e.Convert(headersMapValueIndirectStr.Type().Key())
keyValue := headersMapVal.MapIndex(key)
...
...
}
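If the remaining need is to forward those HTTP headers on outgoing gRPC calls, a minimal sketch (assuming google.golang.org/grpc/metadata together with the standard context and strings packages; withForwardedHeaders is just an illustrative helper name) could be:
// withForwardedHeaders copies HTTP headers into the outgoing gRPC metadata.
// gRPC metadata keys are conventionally lowercase.
func withForwardedHeaders(ctx context.Context, headers map[string][]string) context.Context {
	md := metadata.MD{}
	for name, values := range headers {
		md[strings.ToLower(name)] = values
	}
	return metadata.NewOutgoingContext(ctx, md)
}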
My code:
func getSourceUrl(url string) (string, error) {
resp, err := http.Get(url)
if err != nil {
fmt.Println("Error getSourceUrl: ")
return "", err
}
defer resp.Body.Close()
body := resp.Body
// time = 0
sourcePage, err := ioutil.ReadAll(body)
// time > 5 minutes
return string(sourcePage), err
}
I have a website link whose source is around 100,000 lines. Using ioutil.ReadAll takes very long (about 5 minutes for one link). Is there a way to fetch the website source faster? Thank you!
@Minato, try this code and play with the M throttling parameter. If you get too many errors, reduce it.
package main
import (
"fmt"
"io"
"io/ioutil"
"log"
"net/http"
"runtime"
"time"
)
// Token is an empty struct for signalling
type Token struct{}
// N files to get
var N = 301 // at the source 00000 - 00300
// M max go routines
var M = runtime.NumCPU() * 16
// Throttle to max M go routines
var Throttle = make(chan Token, M)
// DoneStatus is used to signal the end of a transfer
type DoneStatus struct {
length int
sequence string
duration float64
err error
}
// ExitOK is a simple exit counter
var ExitOK = make(chan DoneStatus)
// TotalBytes read
var TotalBytes = 0
// TotalErrors captured
var TotalErrors = 0
// URLTempl is the template for URL construction
var URLTempl = "https://virusshare.com/hashes/VirusShare_%05d.md5"
func close(c io.Closer) {
err := c.Close()
if err != nil {
log.Fatal(err)
}
}
func main() {
log.Printf("start main. M=%d\n", M)
startTime := time.Now()
for i := 0; i < N; i++ {
go func(idx int) {
// slow ramp-up: fire getData after i seconds
time.Sleep(time.Duration(i) * time.Second)
url := fmt.Sprintf(URLTempl, idx)
_, _ = getData(url) // errors captured as data
}(i)
}
// Count N byte count signals
for i := 0; i < N; i++ {
status := <-ExitOK
TotalBytes += status.length
if status.err != nil {
TotalErrors++
log.Printf("[%d] : %v\n", i, status.err)
continue
}
log.Printf("[%d] file %s, %.1f MByte, %.1f min, %.1f KByte/sec\n",
i, status.sequence,
float64(status.length)/(1024*1024),
status.duration/60,
float64(status.length)/(1024)/status.duration)
}
// totals
duration := time.Since(startTime).Seconds()
log.Printf("Totals: %.1f MByte, %.1f min, %.1f KByte/sec\n",
float64(TotalBytes)/(1024*1024),
duration/60,
float64(TotalBytes)/(1024)/duration)
// using fatal to verify only one go routine is running at the end
log.Fatalf("TotalErrors: %d\n", TotalErrors)
}
func getData(url string) (data []byte, err error) {
var startTime time.Time
defer func() {
// release token
<-Throttle
// signal end of go routine, with some status info
ExitOK <- DoneStatus{
len(data),
url[41:46],
time.Since(startTime).Seconds(),
err,
}
}()
// acquire one of M tokens
Throttle <- Token{}
log.Printf("Started file: %s\n", url[41:46])
startTime = time.Now()
resp, err := http.Get(url)
if err != nil {
return
}
defer close(resp.Body)
data, err = ioutil.ReadAll(resp.Body)
if err != nil {
return
}
return
}
Per-transfer variation is about 10-40 KByte/sec; the final total for all 301 files is 928 MB in 11.1 min, at 1425 KByte/sec. I believe you should be able to get similar results.
Outside the scope of the question, but maybe useful: also give http://www.dslreports.com/speedtest/ a try. Go to settings, select a bunch of US servers for testing, and set the duration to 60 sec. This will tell you what your actual effective total rate to the US is.
Good luck!
You could iterate over the response a section at a time, something like:
responseSection := make([]byte, 128)
n, err := body.Read(responseSection) // reads up to 128 bytes; call in a loop until io.EOF
return string(responseSection[:n]), err
which would read 128 bytes at a time. However, I would suggest confirming that the download speed is not what's causing the slow load.
The 5 minutes is probably network time.
That said, you generally would not want to buffer enormous objects in memory.
resp.Body is a Reader.
So you could use io.Copy to copy its contents into a file.
Converting sourcePage into a string is a bad idea as it forces another allocation.
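For example, a minimal sketch that streams the response straight to a file with io.Copy (saveSourceUrl and the destination path are just illustrative; error handling kept short) might be:
func saveSourceUrl(url, path string) (int64, error) {
	resp, err := http.Get(url)
	if err != nil {
		return 0, err
	}
	defer resp.Body.Close()

	f, err := os.Create(path)
	if err != nil {
		return 0, err
	}
	defer f.Close()

	// Stream the body to disk without buffering the whole page in memory.
	return io.Copy(f, resp.Body)
}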
I set a value in the structure, but it does not stick. The methods are called sequentially, not in parallel. How can that be? This is Go, I forgot to say.
If I change the code to set the value in the "start" method (instead of the "init" method), it works; but setting the value in the "init" method fails. This looks very strange to me.
package main
import (
"log"
"net/http"
"time"
)
type tServer struct {
ipAddress string
port string
server http.Server
}
var server tServer
func main() {
server.ipAddress = "0.0.0.0"
server.port = "12345"
server.init()
server.start()
time.Sleep(time.Second * 5)
}
func (srv tServer) init() {
srv.server.Addr = srv.ipAddress + ":" + srv.port
log.Println("srv.server.Addr=", srv.server.Addr) ////////////////////
}
func (srv tServer) start() {
log.Println("srv.server.Addr=", srv.server.Addr) ////////////////////
go srv.startServerRoutine()
}
func (srv tServer) startServerRoutine() {
log.Println("Server started at", srv.server.Addr) //
err := srv.server.ListenAndServe()
if err != nil {
log.Println("Server Error:", err) //
return
}
}
Here is the console output:
2017/04/18 19:43:07 srv.server.Addr= 0.0.0.0:12345
2017/04/18 19:43:07 srv.server.Addr=
2017/04/18 19:43:07 Server started at
2017/04/18 19:43:07 Server Error: listen tcp :80: bind: permission denied
This is due to the receiver in the method signatures:
func (srv tServer) init()
// ^^^ copies values
so:
server.init() // updates its own copy of server,
// copy gets disposed after init() returns
server.start() // uses its own copy of server
You need to pass srv by pointer to methods:
func (srv *tServer) init()
In that case, both the init() and start() calls will work on the same tServer value and will share the values in its fields.
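Applied to the code above, a minimal sketch of the change (only the receivers differ) would be:
func (srv *tServer) init() {
	srv.server.Addr = srv.ipAddress + ":" + srv.port
}

func (srv *tServer) start() {
	log.Println("srv.server.Addr=", srv.server.Addr)
	go srv.startServerRoutine()
}

func (srv *tServer) startServerRoutine() {
	log.Println("Server started at", srv.server.Addr)
	if err := srv.server.ListenAndServe(); err != nil {
		log.Println("Server Error:", err)
	}
}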
If you look at the docs for http.Server, you will see that a blank Addr is possible and means ":http":
type Server struct {
Addr string // TCP address to listen on, ":http" if empty
You set srv.server.Addr directly in init(), but this is not the proper way to use the http.Server type.
Perhaps you meant to do this:
func (srv tServer) startServerRoutine() {
	// http.Server.ListenAndServe takes no arguments, so set Addr here instead.
	srv.server.Addr = srv.ipAddress + ":" + srv.port
	log.Println("Server started at", srv.server.Addr)
	err := srv.server.ListenAndServe()
	if err != nil {
		log.Println("Server Error:", err)
		return
	}
}
I want to call FormValue on a golang net/http Request from several middleware handler functions before serving the request, and I do not want to invalidate the request while doing this.
It works fine except when the incoming request has multipart form data: the data gets invalidated after I call FormValue, and there is nothing left to parse in the final route.
I wrote a utility function that solved my problem:
package utils
import (
"bytes"
"io"
"io/ioutil"
"mime"
"mime/multipart"
"net/http"
"strings"
)
// Get form values without invalidating the request body in case the data is a multipart form
func GetFormValues(request *http.Request, keys []string) []string {
var values []string
mediaType, params, err := mime.ParseMediaType(request.Header.Get("Content-Type"))
if err != nil || !strings.HasPrefix(mediaType, "multipart/") {
for i := range keys {
values = append(values, request.FormValue(keys[i]))
}
} else { // multipart form
buf, _ := ioutil.ReadAll(request.Body)
origBody := ioutil.NopCloser(bytes.NewBuffer(buf))
var rdr = multipart.NewReader(bytes.NewBuffer(buf), params["boundary"])
for len(values) < len(keys) {
part, err_part := rdr.NextPart()
if err_part == io.EOF {
break
}
if err_part != nil { // any other read error also ends the scan
break
}
for i := range keys {
if part.FormName() == keys[i] {
buf := new(bytes.Buffer)
buf.ReadFrom(part)
values = append(values, buf.String())
}
}
}
request.Body = origBody
}
if len(values) == len(keys) {
return values
} else {
return nil
}
}
// Get form value without invalidating the request body in case the data is a multipart form
func GetFormValue(request *http.Request, key string) string {
if result := GetFormValues(request, []string{key}); len(result) == 1 {
return result[0]
} else {
return ""
}
}
Now instead of calling
value := request.FormValue(key)
I do
value := utils.GetFormValue(request, key)
or for multiple values
values := utils.GetFormValues(request, keys) // where keys is a []string of form keys
As an exercise I created a small HTTP server that generates random game mechanics, similar to this one. I wrote it on a Windows 7 (32-bit) system and it works flawlessly. However, when I run it on my home machine, Windows 7 (64-bit), it always fails with the same message: exit status -1073741819. I haven't managed to find anything on the web which references that status code, so I don't know how important it is.
Here's code for the server, with redundancy abridged:
package main
import (
"fmt"
"math/rand"
"time"
"net/http"
"html/template"
)
// Info about a game mechanic
type MechanicInfo struct { Name, Desc string }
// Print a mechanic as a string
func (m MechanicInfo) String() string {
return fmt.Sprintf("%s: %s", m.Name, m.Desc)
}
// A possible game mechanic
var (
UnkillableObjects = &MechanicInfo{"Avoiding Unkillable Objects",
"There are objects that the player cannot touch. These are different from normal enemies because they cannot be destroyed or moved."}
//...
Race = &MechanicInfo{"Race",
"The player must reach a place before the opponent does. Like \"Timed\" except the enemy as a \"timer\" can be slowed down by the player's actions, or there may be multiple enemies being raced against."}
)
// Slice containing all game mechanics
var GameMechanics []*MechanicInfo
// Pseudorandom number generator
var prng *rand.Rand
// Get a random mechanic
func RandMechanic() *MechanicInfo {
i := prng.Intn(len(GameMechanics))
return GameMechanics[i]
}
// Initialize the package
func init() {
prng = rand.New(rand.NewSource(time.Now().Unix()))
GameMechanics = make([]*MechanicInfo, 34)
GameMechanics[0] = UnkillableObjects
//...
GameMechanics[33] = Race
}
// serving
var index = template.Must(template.ParseFiles(
"templates/_base.html",
"templates/index.html",
))
func randMechHandler(w http.ResponseWriter, req *http.Request) {
mechanics := [3]*MechanicInfo{RandMechanic(), RandMechanic(), RandMechanic()}
if err := index.Execute(w, mechanics); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
}
}
func main() {
http.HandleFunc("/", randMechHandler)
if err := http.ListenAndServe(":80", nil); err != nil {
panic(err)
}
}
In addition, the unabridged code, the _base.html template, and the index.html template.
What could be causing this issue? Is there a process for debugging a cryptic exit status like this?
When I ran it, I got the following two errors:
template: content:6: nil pointer evaluating *main.MechanicInfo.Name
http: multiple response.WriteHeader calls
The former was in the web browser, the latter in the console window where I launched your server.
The nil pointer problem is because your abridged program leaves GameMechanics[1:32] set to nil.
The second error is interesting. The only place in your program that any methods on your http.ResponseWriter get called is inside of index.Execute, which is not your code -- meaning maybe there is something wrong happening in html/template. I'm testing this with Go 1.0.2.
I put _base.html at the top of index.html and then changed index to this:
var index = template.Must(template.ParseFiles("templates/index.html"))
and the "multiple response.WriteHeader calls" warning went away.
Not really an answer, but a direction you could explore.
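For instance, one way to avoid the "multiple response.WriteHeader calls" warning regardless of what goes wrong in the template is to render into a buffer first and only write to the ResponseWriter on success; a minimal sketch (assuming the bytes package is imported):
func randMechHandler(w http.ResponseWriter, req *http.Request) {
	mechanics := [3]*MechanicInfo{RandMechanic(), RandMechanic(), RandMechanic()}
	var buf bytes.Buffer
	// Render into a buffer first so a template error never produces a partial response.
	if err := index.Execute(&buf, mechanics); err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	buf.WriteTo(w)
}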
As a bonus, here's the more "Go way" of writing your program. Note that I simplified the use of the PRNG (you don't need to instantiate unless you want several going in parallel) and simplified the structure initializer:
package main
import (
"fmt"
"html/template"
"math/rand"
"net/http"
)
// Info about a game mechanic
type MechanicInfo struct{ Name, Desc string }
// Print a mechanic as a string
func (m MechanicInfo) String() string {
return fmt.Sprintf("%s: %s", m.Name, m.Desc)
}
// The game mechanics
var GameMechanics = [...]*MechanicInfo{
{"Avoiding Unkillable Objects",
"There are objects that the player cannot touch. These are different from normal enemies because they cannot be destroyed or moved."},
{"Race",
"The player must reach a place before the opponent does. Like \"Timed\" except the enemy as a \"timer\" can be slowed down by the player's actions, or there may be multiple enemies being raced against."},
}
// Get a random mechanic
func RandMechanic() *MechanicInfo {
i := rand.Intn(len(GameMechanics))
return GameMechanics[i]
}
var index = template.Must(template.ParseFiles("templates/index.html"))
func randMechHandler(w http.ResponseWriter, req *http.Request) {
mechanics := [3]*MechanicInfo{RandMechanic(), RandMechanic(), RandMechanic()}
if err := index.Execute(w, mechanics); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
}
}
func main() {
http.HandleFunc("/", randMechHandler)
if err := http.ListenAndServe(":80", nil); err != nil {
panic(err)
}
}