I've got the following Jasmine spec:
describe 'MyApp.Controller.SomeController', ->
  beforeEach module('mymodule')

  beforeEach inject ($rootScope, $httpBackend, $controller, SomeService) ->
    @scope = $rootScope.$new()
    @httpBackend = $httpBackend
    someService = SomeService
    @someResource = someService.someResource
    $controller 'MyApp.Controller.SomeController', $scope: @scope

  describe "#fetchMethod", ->
    describe "given an object", ->
      beforeEach ->
        @id = 17
        @scope.fetchMethod(@id)

      it "sets due to true", ->
        @httpBackend.whenGET().respond(200, {"someStrings": ["foo", "bar"], otherStrings: ["bar", "goo"]})
        expect(@scope.someStrings).toBeDefined()
        expect(@scope.otherStrings).toBeDefined()
It wraps the following controller:
MyApp.Controller.SomeController = (scope, someService) ->
  scope.fetchMethod = (ID) ->
    someService.someResource.fetch
      Id: ID
    , (response) ->
      scope.someStrings = response['someStrings']
      scope.otherStrings = response['otherStrings']
      scope.someStringsExist = true if scope.someStrings

MyApp.Controller.SomeController.$inject = ['$scope', 'SomeService']
Where SomeService is defined as follows:
MyApp.Service.SomeService = (resource) ->
  @someResource = resource '/api/foos/:ID', {},
    fetch:
      method: 'GET'
  @

MyApp.myModule.service 'SomeService', ['$resource', MyApp.Service.SomeService]
This setup appears to work on the site, correctly executing the request and returning values from the (Rails) API endpoint.
However, when the jasmine specs are run it fails with:
Error: Unexpected request: GET /api/foos/17 No more request expected in http://localhost:3000/assets/helpers/angular-mocks.js?body=1 (line 889)
What am I missing? Why is httpBackend failing to recognize the GET request?
Update: here is the initialize function on the scope, for context:

scope.initialize = (artifactId, artifactType) ->
  scope.artifactId = artifactId
  scope.artifactType = artifactType
  scope.currentVersionExists = false
  scope.attachmentFetcher(scope.artifactId)

MyApp.Controller.SomeController.$inject = ['$scope', 'SomeService']
This line where you stub the response should go before you make the request:
@httpBackend.whenGET().respond(200, {"someStrings": ["foo", "bar"], otherStrings: ["bar", "goo"]})
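For example, a rough sketch of the reordered spec (I'm assuming the resource expands to /api/foos/17, as your error message suggests; adjust the URL to match your routes). You will also typically need a flush so the stubbed response is actually delivered before the expectations run:

beforeEach ->
  @id = 17
  # register the stub first, then trigger the request, then flush the backend
  @httpBackend.whenGET("/api/foos/#{@id}").respond(200, {someStrings: ["foo", "bar"], otherStrings: ["bar", "goo"]})
  @scope.fetchMethod(@id)
  @httpBackend.flush()

it "assigns the response arrays to the scope", ->
  expect(@scope.someStrings).toBeDefined()
  expect(@scope.otherStrings).toBeDefined()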
I'm writing my second project with FastAPI, and I got this error.
For example, I have this code in my routers.users.py:
@router.get('/', response_model=Page[Users])
async def get_all_users(db: Session = Depends(get_db)):
    return paginate(db.query(models.User).order_by(models.User.id))
And it works: the limit and page fields show up in the Swagger documentation.
I tried to write the same for routers.recipes.py, but in that case there were no pagination fields (limit, page) in Swagger. I googled and found that adding dependencies could help. Now I see the pagination parameters in Swagger, but the error is still the same.
routers.recipes:
@router.get('/', response_model=Page[PostRecipes], dependencies=[Depends(Params)])
async def get_all_recipes(db: Session = Depends(get_db)):
    return paginate(db.query(models.Recipe).order_by(models.Recipe.id))
pagination:
class Params(BaseModel, AbstractParams):
    page: int = Query(1, ge=1, description="Page number")
    limit: int = Query(50, ge=1, le=100, description="Page size")

    def to_raw_params(self) -> RawParams:
        return RawParams(
            limit=self.limit,
            offset=self.limit * (self.page - 1),
        )


class Page(BasePage[T], Generic[T]):
    page: conint(ge=1)  # type: ignore
    limit: conint(ge=1)  # type: ignore

    __params_type__ = Params

    @classmethod
    def create(
        cls,
        items: Sequence[T],
        total: int,
        params: AbstractParams,
    ) -> Page[T]:
        if not isinstance(params, Params):
            raise ValueError("Page should be used with Params")
        return cls(
            total=total,
            items=items,
            page=params.page,
            limit=params.limit,
        )


__all__ = [
    "Params",
    "Page",
]
So, does anyone have ideas about it?
According to the docs, you have to pass the params explicitly;
your call should look like paginate(iterable, params).
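For example, something along these lines (a sketch reusing your Params, Page, PostRecipes, models and get_db from above; I'm assuming the SQLAlchemy flavour of paginate, so adjust the import if you use a different one):

from fastapi import APIRouter, Depends
from fastapi_pagination.ext.sqlalchemy import paginate
from sqlalchemy.orm import Session

router = APIRouter()

@router.get('/', response_model=Page[PostRecipes])
async def get_all_recipes(params: Params = Depends(), db: Session = Depends(get_db)):
    # Resolve the pagination params as a real argument and hand them to paginate,
    # instead of declaring them only as a route-level dependency.
    return paginate(db.query(models.Recipe).order_by(models.Recipe.id), params)

That way the page/limit query parameters still show up in Swagger, and paginate actually receives them.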
I have built my own custom webhook using ask-sdk, and it is deployed on my EC2 instance. Now I want to use DynamoDB through DynamoDbPersistenceAdapter,
but I am not finding any reference on how to do that.
DynamoDbPersistenceAdapter needs AWS keys, a table name, and some other DynamoDB details, but where do I initialize them? I found some code, but it doesn't cover any of that:
persistenceAdapter = new DynamoDbPersistenceAdapter({
  tableName: 'global_attr_table',
  createTable: true,
  partitionKeyGenerator: keyGenerator
});
This can probably be solved by adding environment variables and by setting up an AWS CLI profile.
Here's how to set up an AWS CLI profile:
https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html
Once you have a profile set up with your AWS access information, you can export environment variables in your command line or in a shell script:
$> export AWS_PROFILE=YourNewAWSCLIProfileName
$> export AWS_REGION=us-east-1
$> export AWS_DEFAULT_REGION=us-east-1
and you can check that these variables are set by typing
$> echo $AWS_PROFILE
$> echo $AWS_REGION
$> echo $AWS_DEFAULT_REGION
This is what I use. If for some reason that doesn't work, here is some research into how you might add a DynamoDB client.
I was trying to solve a different problem, so let me solve yours as I walk through mine.
In: node_modules/ask-sdk/dist/skill/factory/StandardSkillFactory.js
there is a reference to something similar to what you have above:
new ask_sdk_dynamodb_persistence_adapter_1.DynamoDbPersistenceAdapter({
  tableName: thisTableName,
  createTable: thisAutoCreateTable,
  partitionKeyGenerator: thisPartitionKeyGenerator,
  dynamoDBClient: thisDynamoDbClient,
})
I believe you need to create a DynamoDB client instance, which I found referenced in the AWS SDK docs:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/dynamodb-example-document-client.html
You'd have to set your own service:
In: node_modules/aws-sdk/lib/dynamodb/document_client.js
/**
 * Creates a DynamoDB document client with a set of configuration options.
 *
 * @option options params [map] An optional map of parameters to bind to every
 *   request sent by this service object.
 * @option options service [AWS.DynamoDB] An optional pre-configured instance
 *   of the AWS.DynamoDB service object to use for requests. The object may
 *   bound parameters used by the document client.
 * @option options convertEmptyValues [Boolean] set to true if you would like
 *   the document client to convert empty values (0-length strings, binary
 *   buffers, and sets) to be converted to NULL types when persisting to
 *   DynamoDB.
 * @see AWS.DynamoDB.constructor
 *
 */
constructor: function DocumentClient(options) {
  var self = this;
  self.options = options || {};
  self.configure(self.options);
},

/**
 * @api private
 */
configure: function configure(options) {
  var self = this;
  self.service = options.service;
  self.bindServiceObject(options);
  self.attrValue = options.attrValue =
    self.service.api.operations.putItem.input.members.Item.value.shape;
},

/**
 * @api private
 */
bindServiceObject: function bindServiceObject(options) {
  var self = this;
  options = options || {};
  if (!self.service) {
    self.service = new AWS.DynamoDB(options);
  } else {
    var config = AWS.util.copy(self.service.config);
    self.service = new self.service.constructor.__super__(config);
    self.service.config.params =
      AWS.util.merge(self.service.config.params || {}, options.params);
  }
},
I'm not sure what those options might look like.
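For the adapter itself, a minimal sketch of passing in a pre-configured client might look like this (untested; the table name and region are placeholders, keyGenerator is whatever generator you already use, and credentials are picked up from the AWS CLI profile / environment variables above):

const AWS = require('aws-sdk');
const { DynamoDbPersistenceAdapter } = require('ask-sdk-dynamodb-persistence-adapter');

// Low-level DynamoDB client, configured explicitly instead of relying on defaults.
const dynamoDbClient = new AWS.DynamoDB({
  apiVersion: 'latest',
  region: process.env.AWS_REGION || 'us-east-1'
});

const persistenceAdapter = new DynamoDbPersistenceAdapter({
  tableName: 'global_attr_table',
  createTable: true,
  partitionKeyGenerator: keyGenerator, // your existing key generator
  dynamoDBClient: dynamoDbClient       // hand the pre-configured client to the adapter
});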
I am new to Groovy. I have code like this:
String flavor
HashMap config = new HashMap([ ttl: 0, url: url, appName: appName, enable: true ])
client.put("${data}.json", config)
From this config map I need to read back the values of appName and enable.
For that I used the get method, but I am not sure about it:
def values = client.get("${data}.json");
While using this get method I am getting the following error. Since I am new to Groovy, I don't know what is happening here:
groovy.lang.MissingMethodException: No signature of method: com.comcast.csv.haxor.SecureFirebaseRestClient.get() is applicable for argument types: (org.codehaus.groovy.runtime.GStringImpl) values: [testJson.json]
Possible solutions: get(com.comcast.tvx.megahttp.utils.URL, java.lang.Class), get(java.lang.String, java.lang.Class), grep(), grep(java.lang.Object), getAt(java.lang.String), wait()
I'm not sure what you are trying to do, but (without knowing other details) I'd write your code this way:
Map config = [ ttl: 0, url: url, appName: appName, enable: true ]
client[ "${data}.json" ] = config
def values = client[ "${data}.json" ]
assuming that you wanted to use the getAt() method (the [] shortcut) instead of get()
Try this:
def config = [ ttl: 0, url: url, appName: appName, enable: true ]
def endpoint = "${data}.json" as String
client.put(endpoint, config)
def values = client.get(endpoint, HashMap)
def appName = values.appName
def enable = values.enable
I couldn't find any info on SecureFirebaseRestClient, so I'm guessing about how it works.
I am trying to find out how to write a server unit test with Jasmine.
This is what I have so far:
/both/posts.coffee
@Posts = new Mongo.Collection('posts')

class @Post extends Minimongoid
  @_collection: @Posts

  @defaults:
    title: ''

  validate: ->
    unless @title.length > 5
      @error('title', 'Title is required and should be longer than 5 letters.')
/tests/server/unit/posts/spec/postSpec.coffee
describe 'Post', ->
  post = undefined

  beforeEach ->
    post = new Post()

  describe 'fields', ->
    it 'should be able to assign title with strings', ->
      title = "The Title"
      post.title = title
      expect(post.title).toBe title
server console:
(STDERR) [sanjo:jasmine]: The code has syntax errors. [ReferenceError: Minimongoid is not defined]
What is wrong there? How can I get this simple test passed?
I got it working when I moved the whole content of the unit test into the integration test folder and wrapped the code like this:
/tests/server/integration/posts/spec/postSpec.coffee
Jasmine.onTest ->
  # my code
Now it's all green. Thank you @Marius Darila.
I have two different data sources for which I need two different PersistenceManagerFactory instances. I can always get this by writing a persistence.xml file, but I want it to be set up programmatically. Though the second data source remains relatively unchanged, the first data source may have additions to it via plugins. These plugins can come with one or more JDO-annotated classes. A persistence.xml wouldn't be such a good idea here because I want them to be loaded at runtime.
In Hibernate (and with JPA) this is possible by creating a configuration object and adding all annotated classes to it. Whenever I see a new plugin being loaded, I can shut down the SessionFactory and reload it with the extra classes from the plugin by looking for the @Entity annotation.
Is there a similar way to do it in DataNucleus/JDO?
I tried searching on Google, but all I end up at is the DataNucleus site explaining how to write a persistence.xml file.
The code below in Scala demonstrates how you can create a PersistenceManager dynamically. You just have to populate a map and pass it to JDOHelper#getPersistenceManagerFactory.
private val pu = props(db.driver, db.url, db.username, db.password)
private val pmf = JDOHelper.getPersistenceManagerFactory(pu.asJava)
private val pm = pmf.getPersistenceManager.asInstanceOf[JDOPersistenceManager]
Below is an example of how you might populate such a map for H2, PostgreSQL, and MongoDB:
def props(driver: String, url: String, username: String, password: String): Map[String, Any] =
  driver match {
    case "org.h2.Driver" =>
      Map[String, Any](
        "javax.jdo.option.Mapping" -> "h2",
        "datanucleus.schema.autoCreateAll" -> "true",
        "javax.jdo.PersistenceManagerFactoryClass" -> "org.datanucleus.api.jdo.JDOPersistenceManagerFactory",
        "javax.jdo.option.ConnectionDriverName" -> driver,
        "javax.jdo.option.ConnectionURL" -> url,
        "javax.jdo.option.ConnectionUserName" -> username,
        "javax.jdo.option.ConnectionPassword" -> password
      )
    case "org.postgresql.Driver" =>
      Map[String, Any](
        "datanucleus.schema.autoCreateAll" -> "true",
        "javax.jdo.PersistenceManagerFactoryClass" -> "org.datanucleus.api.jdo.JDOPersistenceManagerFactory",
        "javax.jdo.option.ConnectionDriverName" -> driver,
        "javax.jdo.option.ConnectionURL" -> url,
        "javax.jdo.option.ConnectionUserName" -> username,
        "javax.jdo.option.ConnectionPassword" -> password,
        "javax.jdo.option.RetainValues" -> "true",
        "javax.jdo.option.RestoreValues" -> "true",
        "javax.jdo.option.Optimistic" -> "true",
        "javax.jdo.option.NontransactionalWrite" -> "false",
        "javax.jdo.option.NontransactionalRead" -> "true",
        "javax.jdo.option.Multithreaded" -> "true",
        "javax.jdo.option.IgnoreCache" -> "false"
      )
    case "mongodb.jdbc.MongoDriver" =>
      Map[String, Any](
        "javax.jdo.option.Mapping" -> "mongo",
        "datanucleus.schema.autoCreateAll" -> "true",
        "javax.jdo.PersistenceManagerFactoryClass" -> "org.datanucleus.api.jdo.JDOPersistenceManagerFactory",
        "javax.jdo.option.ConnectionDriverName" -> driver,
        "javax.jdo.option.ConnectionURL" -> url,
        "javax.jdo.option.ConnectionUserName" -> username,
        "javax.jdo.option.ConnectionPassword" -> password
      )
    case _ => throw new IllegalArgumentException(s"unknown driver %s".format(driver))
  }