Got HTTPConnectionPool error when POSTing to the KServe example model with the Python SDK - python-requests

I ran the code for case 1 and case 2 locally in VS Code, not in a Notebook on the Kubeflow Central Dashboard.
I'd like to know what I'm missing.
Prepared:
Added a user to dex:
- email: my_name@gmail.com
  hash: 1234 # actually, the hash value
  userID: "myuserid"
  username: myusername
Added a namespace:
$ kubectl apply -f profile.yaml

apiVersion: kubeflow.org/v1beta1
kind: Profile
metadata:
  name: exam-namespace
spec:
  owner:
    kind: User
    name: my_name@gmail.com
  resourceQuotaSpec: {}
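Once the Profile is applied, the profile controller should create the namespace; a quick sanity check, assuming kubectl access to the cluster:

$ kubectl get profile exam-namespace
$ kubectl get namespace exam-namespace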
Served the model:
Kubeflow Central Dashboard - Models - +NEW MODEL SERVER

apiVersion: "serving.kserve.io/v1beta1"
kind: "InferenceService"
metadata:
  annotations:
    sidecar.istio.io/inject: "false"
  name: "sklearn-iris"
spec:
  predictor:
    sklearn:
      image: "kserve/sklearnserver:v0.9.0"
      storageUri: "gs://kfserving-examples/models/sklearn/1.0/model"
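Before posting requests, it can help to confirm the InferenceService came up and note both URLs it reports; a minimal sketch using the same KServeClient as case 2 below (model and namespace names taken from the manifest above):

from kserve import KServeClient

kserve = KServeClient()
isvc = kserve.get("sklearn-iris", namespace="exam-namespace")
# the status block carries readiness conditions plus the served URLs
print(isvc['status']['url'])             # external URL, reachable through the ingress gateway
print(isvc['status']['address']['url'])  # cluster-local URL, only resolvable inside the cluster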
What I did with the Python SDK:
Case 1: using kfp.Client()
import json

import kfp
import requests

HOST = "http://localhost:8080"
NAME_SPACE = "exam-namespace"
USER_NAME = "my_name@gmail.com"
USER_PS = '1234'

# log in through dex and grab the auth session cookie
session = requests.Session()
response = session.get(HOST)
headers = {
    "Content-Type": "application/x-www-form-urlencoded",
}
data = {"login": USER_NAME, "password": USER_PS}
session.post(response.url, headers=headers, data=data)
session_cookie = session.cookies.get_dict()["authservice_session"]

# access the Kubeflow dashboard
client = kfp.Client(
    host=f"{HOST}/pipeline",
    namespace=f"{NAME_SPACE}",
    cookies=f"authservice_session={session_cookie}")

session_cookie = session.cookies.get_dict()

sklear_iris_input = dict(instances=[
    [6.8, 2.8, 4.8, 1.4],
    [6.0, 3.4, 4.5, 1.6]
])
headers = {'Host': "http://sklearn-iris.project-pipeline.example.com"}
res = session.post(f"{HOST}/v1/models/v1/models/sklearn-iris:predict",
                   headers=headers,
                   cookies=session_cookie,
                   data=json.dumps(sklear_iris_input))
print(f"res.json : {res.json()}")
And got this:
HTTPSConnectionPool(host='127.0.0.1', port=8080):
Max retries exceeded with url: /v1/models/v1/models/sklearn-iris:predict (Caused by SSLError(SSLError(1, '[SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1131)')))
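For reference, a minimal sketch of how the predict request is usually formed when port-forwarding through the Istio ingress gateway — assuming the default Kubeflow/dex setup, a plain-http port-forward, and the knative host convention <model>.<namespace>.example.com; the single /v1/models/ path segment and the scheme-less Host header are the points to compare against the code above:

import json
import requests

HOST = "http://localhost:8080"   # port-forwarded istio-ingressgateway, plain http
MODEL = "sklearn-iris"
NAME_SPACE = "exam-namespace"

sklear_iris_input = dict(instances=[
    [6.8, 2.8, 4.8, 1.4],
    [6.0, 3.4, 4.5, 1.6]
])
headers = {"Host": f"{MODEL}.{NAME_SPACE}.example.com"}   # no scheme inside a Host header
res = requests.post(
    f"{HOST}/v1/models/{MODEL}:predict",                  # /v1/models/ appears once
    headers=headers,
    cookies={"authservice_session": session_cookie},      # cookie from the dex login above
    data=json.dumps(sklear_iris_input),
)
print(res.json())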
Case 2: using KServeClient()
import json

import requests
from kserve import utils, KServeClient

NAME_SPACE = "exam-namespace"
SERVICE_NAME = "sklearn-iris"

kserve = KServeClient()
isvc_resp = kserve.get(SERVICE_NAME, namespace=NAME_SPACE)

# http://sklearn-iris.exam-namespace.svc.cluster.local/v1/models/sklearn-iris:predict
isvc_url = isvc_resp['status']['address']['url']

sklear_iris_input = dict(instances=[
    [6.8, 2.8, 4.8, 1.4],
    [6.0, 3.4, 4.5, 1.6]
])
# note: requests' json= parameter expects a dict and encodes it itself
response = requests.post(isvc_url, json=sklear_iris_input)
print(response.text)
And got this:
HTTPConnectionPool(host='sklearn-iris-test2.project-pipeline.svc.cluster.local', port=80):
Max retries exceeded with url: /v1/models/sklearn-iris-test2:predict (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe70aa8e9a0>: Failed to establish a new connection: [Errno -2] Name or service not known'))
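Note that status.address.url is the cluster-local address: *.svc.cluster.local names only resolve inside the cluster, so a request from a local desktop fails DNS exactly as shown. A sketch of the usual workaround when running outside the cluster — assuming the same localhost:8080 port-forward as case 1 and reusing isvc_url and sklear_iris_input from the snippet above (a dex-protected gateway will additionally need the authservice_session cookie):

from urllib.parse import urlparse

import requests

parsed = urlparse(isvc_url)
headers = {"Host": parsed.hostname}                # keep the in-cluster host so Istio can route
local_url = f"http://localhost:8080{parsed.path}"  # but send the request to the port-forward
response = requests.post(local_url, headers=headers, json=sklear_iris_input)
print(response.text)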
After adding a new user to dex and creating a namespace via profile.yaml, do I need to add specific information to an AuthorizationPolicy?
I installed Kubernetes and Kubeflow on this desktop and port-forwarded through istio-ingressgateway, and I ran the code for case 1 and case 2 locally. Is there any problem with setting the host to localhost:8080?
If neither of the above two questions involves anything I need to do, how should the HTTPConnectionPool error be resolved?

Related

Azure Bicep RG Deployment - Object reference not set to an instance of an object [duplicate]

I'm trying to create a simple App Service Plan with the below code.

param Location string = 'eastus'

resource appServicePlan1 'Microsoft.Web/serverfarms@2020-12-01' = {
  name: 'myasp'
  location: Location
  sku: {
    name: 'S1'
    capacity: 1
  }
}
Below is the Azure CLI command that I'm using to execute the above Bicep script
az deployment group create --name deploy1 --resource-group az-devops-eus-dev-rg1 --template-file main.bicep
Below is the screenshot of the failure ("Object reference not set to an instance of an object"; image not included).
All this was working earlier. I'm using the latest version of Bicep (v0.9.1) which is available as of today.
Any pointers on why this is occurring now would be much appreciated.
Just had this issue in an MS workshop. We solved it by adding an empty properties element to the appServicePlan. Example:
param Location string = 'eastus'

resource appServicePlan1 'Microsoft.Web/serverfarms@2020-12-01' = {
  name: 'myasp'
  location: Location
  properties: {}
  sku: {
    name: 'S1'
    capacity: 1
  }
}

Openstack database create instance BadRequestException

I used the OpenStack SDK to create a new database instance, but it raises a BadRequestException:
BadRequestException: 400: Client Error for url:
[URL]:8779/v1.0/ab77ed8ae3f744f4baf4fb7bc97848cc/instances,
Resource None cannot be found.
My data:
data: dict = {
    'name': 'nhinn-db-01',
    'nics': [{'net-id': 'd57e7864-5961-4a64-a5ce-005e05d71ccf'}],
    'datastore': {'type': 'mysql', 'version': '5.7.31'},
    'flavor': flavor,  # a Flavor object
}
conn: Connection  # an established openstack.connection.Connection
conn.database.create_instance(**data)

Accessing OpenShift Secrets in the application code of an Airflow PythonOperator

I'm using OpenShift to deploy Airflow. The deployment consists of two pods: the Airflow webserver and the Airflow DB (the Airflow metadata database). I have created a Secret that holds service account details, namely a username and password, and I want to access those values in my data pipeline.
My Secret config:

apiVersion: v1
kind: Secret
metadata:
  name: airflow-service-account
  namespace: CUSTOM_NAMESPACE
type: Opaque
stringData:
  user-name: SERVICE_ACCOUNT_USERNAME_IN_BASE64
  service-password: SERVICE_ACCOUNT_PASSWORD_IN_BASE64
I want to access user-name and service-password in the Airflow data pipeline, e.g.:
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2022, 1, 1),
    'depends_on_past': False
}

def api_calls():
    '''Want to get the value of user-name and service-password here'''
    # print(user-name)  <- placeholder: how do I read the secret value?
    return 'Able to access the value of user-name and service-password'

def error_check():
    return 'Error function called'

dag = DAG("testing_dag", default_args=default_args, schedule_interval=timedelta(days=1))

t1 = PythonOperator(task_id="test", python_callable=api_calls, dag=dag)
t2 = PythonOperator(task_id="test_two", python_callable=error_check, dag=dag)

t1 >> t2
Any possible way to resolve this issue?
Option 1 - in your code, read the file that contains the secret from the path where it's mounted.
https://kubernetes.io/docs/concepts/configuration/secret/#using-secrets-as-files-from-a-pod
Option 2 - perhaps easier - mount the secret as an environment variable, and read that in your code instead.
https://kubernetes.io/docs/concepts/configuration/secret/#using-secrets-as-environment-variables
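For example, a minimal sketch of option 2, assuming the pod spec maps the secret's keys to environment variables named SERVICE_USER and SERVICE_PASSWORD via secretKeyRef (those names are assumptions, not something the manifests above define):

import os

def api_calls():
    # injected by the pod spec via secretKeyRef (variable names assumed)
    user_name = os.environ.get("SERVICE_USER")
    service_password = os.environ.get("SERVICE_PASSWORD")
    # option 1 equivalent would be reading the mounted file instead, e.g.
    # open("/etc/airflow-secrets/user-name").read().strip()
    print(f"user-name is {'set' if user_name else 'missing'}")  # avoid printing the secret itself
    return 'Able to access the value of user-name and service-password'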

Symfony 4 enable logging with Monolog's Redis handler

I have a working ELK stack connected to Redis.
I also have a working stateless Symfony 4 application and I want to send all the production logs to my Redis.
I know Monolog has a Redis handler, but I don't know how I'm supposed to tweak the config/prod/monolog.yaml file to accomplish this, or if there's another approach.
This is how it looks right now:
monolog:
    handlers:
        main:
            type: fingers_crossed
            action_level: error
            handler: nested
            excluded_http_codes: [404]
        nested:
            type: stream
            path: "php://stderr"
            level: debug
        console:
            type: console
            process_psr_3_messages: false
            channels: ["!event", "!doctrine"]
        deprecation:
            type: stream
            path: "php://stderr"
        deprecation_filter:
            type: filter
            handler: deprecation
            max_level: info
            channels: ["php"]
The approach I took was to first install the predis client:
composer require predis/predis
Then create a custom service class that extends the RedisHandler class that comes with the Monolog package:
namespace App\Service\Monolog\Handler;

use Monolog\Handler\RedisHandler;
use Monolog\Logger;
use Predis\Client as PredisClient;

class Redis extends RedisHandler
{
    public function __construct($host, $port = 6379, $level = Logger::DEBUG, $bubble = true, $capSize = false)
    {
        $predis = new PredisClient("tcp://$host:$port");
        $key = 'logstash';

        parent::__construct($predis, $key, $level, $bubble, $capSize);
    }
}
Next, activate the service we just created in the services.yml config file:
services:
    monolog.handler.redis:
        class: App\Service\Monolog\Handler\Redis
        arguments: [ '%redis.host%' ]
Be sure the parameter redis.host is set and points to your Redis server. In my case, my parameter value is the IP of my Redis server.
I added other parameters to the class, like port and log level. You can set them when the service is instantiated, just like the host parameter.
Finally, configure your custom log handler service in your monolog.yaml config file. In my case, I only need it for the production logs, with the config as follows:
handlers:
    custom:
        type: service
        id: monolog.handler.redis
        level: debug
        channels: ['!event']
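If the wiring works, log entries should start accumulating in the Redis list the handler writes to ('logstash' in the class above); a quick check from any machine that can reach Redis, assuming redis-cli is installed:

redis-cli -h <redis-host> LLEN logstash
redis-cli -h <redis-host> LRANGE logstash 0 0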

missing method exceptions with Groovy

I am new to Groovy. I have code like this:
String flavor
HashMap config = new HashMap([ ttl: 0, url: url, appName: appName, enable: true ])
client.put("${data}.json", config)
From that map I need to read the values of appName and enable.
For that I used the get method, but I am not sure about this:
def values = client.get("${data}.json");
While using this get method I am getting the following error. Since I am new to Groovy, I don't know what is happening here:
groovy.lang.MissingMethodException: No signature of method: com.comcast.csv.haxor.SecureFirebaseRestClient.get() is applicable for argument types: (org.codehaus.groovy.runtime.GStringImpl) values: [testJson.json]
Possible solutions: get(com.comcast.tvx.megahttp.utils.URL, java.lang.Class), get(java.lang.String, java.lang.Class), grep(), grep(java.lang.Object), getAt(java.lang.String), wait()
Not sure what you are trying to do, but (without knowing other details) I'd write your code this way:
Map config = [ ttl: 0, url: url, appName: appName, enable: true ]
client[ "${data}.json" ] = config
def values = client[ "${data}.json" ]
assuming that you wanted to use the getAt() method (the [] shortcut) instead of get()
Try this:
def config = [ ttl: 0, url: url, appName: appName, enable: true ]
def endpoint = "${data}.json" as String
client.put(endpoint, config)
def values = client.get(endpoint, HashMap)
def appName = values.appName
def enable = values.enable
I couldn't find any info on SecureFirebaseRestClient, so I'm guessing about how it works.
