Where are the texts of walking NPCs stored? - azerothcore

Where are the phrases of walking NPCs stored? I've found their waypoints in the waypoint_data table, but there is no text for the points... And I've found their phrases in creature_text and in broadcast_text, but I can't tell which text belongs to which waypoint. How do I find that?
It would also be great to find out how the fields in these tables relate to each other, for adding or correcting localizations when the text is not hard-coded in the core.

Texts for waypoints are not stored anywhere, because that's not how it works.
When a scripted creature says a text, it means that reaching a waypoint with a certain ID triggers an event that makes the creature say a line from creature_text.
To make a creature talk when it reaches a waypoint, you need either a core script or a SmartAI script for that creature. The latter can be found in smart_scripts.
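As a concrete illustration of the smart_scripts link, here is a small sketch that composes the INSERT statement tying a waypoint to a creature_text group. The numeric IDs are assumptions based on TrinityCore-style SmartAI (40 = SMART_EVENT_WAYPOINT_REACHED, 1 = SMART_ACTION_TALK); verify them against your core's documentation before using them.

```python
# Sketch: build a smart_scripts row that makes a creature say a
# creature_text group when it reaches a given waypoint.
# Assumed SmartAI IDs: event_type 40 = SMART_EVENT_WAYPOINT_REACHED,
# action_type 1 = SMART_ACTION_TALK (check your core's docs).

def talk_on_waypoint_sql(entry, waypoint_id, text_group):
    """Return an INSERT for smart_scripts linking creature `entry`,
    waypoint `waypoint_id`, and creature_text group `text_group`."""
    return (
        "INSERT INTO smart_scripts "
        "(entryorguid, source_type, event_type, event_param1, "
        "action_type, action_param1, comment) VALUES "
        f"({entry}, 0, 40, {waypoint_id}, 1, {text_group}, "
        f"'Creature {entry} - On WP {waypoint_id} Reached - Say Line {text_group}');"
    )

sql = talk_on_waypoint_sql(12345, 3, 0)
```

The creature_text row itself (groupid matching `text_group`) then holds the actual line and its localizations.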
If you want to know more about these creature texts, you can read TrinityCore's documentation on the subject, since our cores are very similar in terms of database structure:
https://trinitycore.atlassian.net/wiki/spaces/tc/pages/2130007/creature+text

Related

How to determine why a word was included in description from vision api

I used the computer vision API on an image. The word "pizza" was returned in the description of the image, and the only connection to pizza I can make is a pizza company logo on a napkin. The word "birthday" was also returned. Is there any way to figure out whether "pizza" was returned because of the company logo, or whether it was a guess associated with "birthday"?
This depends on how much detail the API gives you back. If it allows you to observe the intermediate outputs of the classifier used to categorize the image, you can see which parts of the image result in high output values. The pizza company logo on the napkin, depending on how large it appears, is quite likely to be the cause.
If you are using a more open API and classifier, like Keras and the networks provided under keras.applications, you can use what are called "class activation maps" to see which parts of the image cause the result.
If you find the above too hard to do, one easy way to investigate is to crop parts of the image in a loop and pass each crop to the API. I suspect that "birthday" might be related to a distributed feature, so you might not be able to find where it comes from, whereas "pizza" is probably from the logo or some other localized part of the image.
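The crop-and-probe loop can be sketched as below. The `call_vision_api` function is a placeholder for whatever client you actually use (it is stubbed here so the sliding-window logic is shown end to end); in a real run you would crop the image to each box, e.g. with Pillow's `Image.crop`, and send the crop to the API.

```python
# Sketch of the crop-and-probe loop: slide a window over the image,
# classify each crop, and keep the windows that still trigger the tag.

def crop_boxes(width, height, win, step):
    """Yield (left, top, right, bottom) square windows covering the image."""
    for top in range(0, height - win + 1, step):
        for left in range(0, width - win + 1, step):
            yield (left, top, left + win, top + win)

def call_vision_api(box):
    # Placeholder: crop the image to `box` and send it to the API,
    # returning the list of tags the API reports for that crop.
    return []

def probe(width, height, win, step, target="pizza"):
    """Return the boxes whose crops still produce the target tag."""
    return [b for b in crop_boxes(width, height, win, step)
            if target in call_vision_api(b)]

boxes = list(crop_boxes(400, 300, 200, 100))
```

If only a few adjacent boxes keep producing "pizza", the trigger is localized (like the logo); if the tag disappears for every crop, it is more likely a distributed or co-occurrence guess.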

Totally offline app - SQLite - show POIs

Can I use osmdroid to save my offline maps in an SQLite DB, and then show some POIs offline and get distances between my location and the locations of the POIs?
From my research I figured out that osmdroid does not support offline searching for POIs. Is that true?
Is it better to look into the library Mapsforge?
With osmdroid, you can
View downloaded, prerendered, map tiles offline via database or several other mechanisms
Plot icons, lines, polygons, etc. and attach on-click handlers for each item
Show your location on the map
So that said, you can get the point for a specific item on the map. If you can get your location, then it's a simple equation to calculate straight line distance.
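That "simple equation" is typically the great-circle (haversine) distance, which is more than accurate enough for POI ranges. A minimal sketch, with coordinates in decimal degrees:

```python
# Haversine: straight-line (great-circle) distance between two
# lat/lon points in decimal degrees, returned in metres.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

Feed it your GPS fix and each POI's stored coordinates to rank POIs by distance.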
If you want the points to be offline and searchable, you'll need a database of some sort, a way to populate it and some strategy to search it. This is pretty much what mapsforge is doing. I've done similar things with wikimapia data and it can definitely be done, but there's nothing provided out of the box since osmdroid only handles raster images.
If online is an option, osmbonuspack provides a number of drivers to search several online resources for POIs, including turn by turn directions.

GraceNote - generate playlist with music of a given country

I would like to use GraceNote to generate playlists which contain songs likely to appeal to, or at least be known to, residents of a given country, e.g. Japan, Korea, Turkey, Brazil, France...
They don't necessarily have to be in the local language, as I don't think that I can do that with GraceNote (can I ?), but local artists would be nice. Is there any way, for instance, to query and generate a playlist using artist origin?
I realize that something like Gangnam Style might be known in most countries ;-) and that play-list generation is inexact when used this way, but I would be happy with a 70 or 80% "I know that song" reaction.
Can it be done? If so, how? #cweichen, can you help?
It seems likely you are referring to the Rhythm API. As you can probably see from the function definition, you cannot create a playlist using 'ARTIST_ORIGIN'.
The closest thing I can think of is creating a playlist (aka radio station) using a popular song in the given country as a seed.
You may try configuring the 'focus_similarity' value to get a wider variety of songs. This is just a suggestion, and I am not sure whether it will get you what you're looking for.
*Pygn currently does not support 'focus_similarity' configuration, but it should not be too difficult to add it yourself.
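To make the seed-song idea concrete, here is a hedged sketch of composing a radio-creation request. The endpoint path and parameter names below are assumptions modelled on the Rhythm API pattern discussed above, not verified against Gracenote's documentation, and no request is actually sent; check the official docs for the real names.

```python
# Sketch: build a radio/create request URL from a seed artist/track.
# Endpoint path and parameter names are ASSUMED, not confirmed Gracenote API.
from urllib.parse import urlencode

def build_radio_url(base, client_id, user_id, artist, track,
                    focus_similarity=None):
    params = {
        "client": client_id,
        "user": user_id,
        "artist_name": artist,  # seed artist (assumed parameter name)
        "track_title": track,   # seed track (assumed parameter name)
    }
    if focus_similarity is not None:
        # Assumed semantics: lower similarity focus -> wider variety.
        params["focus_similarity"] = focus_similarity
    return f"{base}/radio/create?{urlencode(params)}"

url = build_radio_url("https://example-gracenote-host/webapi/json/1.0",
                      "my-client", "my-user", "Psy", "Gangnam Style",
                      focus_similarity=-2)
```

The idea is simply: seed with a hit that is popular in the target country, then loosen the similarity focus so the station wanders across that country's catalogue.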

Biped arduino robot that can navigate in the house

I will create a biped robot soon; I will add speech recognition and other features to it.
I want it to find its way around my house. Is it possible to create a map or something similar where I mark the
places in the house with numbers, and then make the Arduino robot
read it, so that e.g. when I say "Go to your room" (the Arduino's room = my room) it will go
to its room (my room) automatically?
UPDATE:
Is there a GPS module or something similar that I can modify the way I want, so my robot
can find its way around my house? I'd like to mark where it can go and where the rooms are,
and program it to go to, e.g., my room when I say so.
There are many different ways of attacking this scenario. If you are talking about a map, it might be worth devising a measurement system whereby you can use an array of units to let your robot navigate the house. I think you will run into issues over time with unexpected variance, so a large part of the navigation code would have to tackle calibrating against a known map.
The advantage of this method is that the robot could "learn" a new space by mapping a new array against your units.
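One way to sketch the "array of units" idea is a grid map of the house with labelled cells, plus a breadth-first search that gives the robot a cell-by-cell route to a named room. The layout and room names below are made up for illustration; on an Arduino the same logic would be ported to C, with the grid stored in program memory.

```python
# Sketch: grid map of the house + BFS route to a named room.
from collections import deque

# 0 = free floor, 1 = wall/obstacle
GRID = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
ROOMS = {"my room": (0, 0), "kitchen": (2, 3)}  # label -> (row, col)

def route(start, room):
    """Return a list of grid cells from start to the named room, or None."""
    goal = ROOMS[room]
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(GRID) and 0 <= nc < len(GRID[0])
                    and GRID[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None
```

The hard part, as noted above, is keeping the robot's believed cell in sync with reality; dead-reckoning drift means you need landmarks or sensors to re-calibrate against the map.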

Categories of tags

I'm starting a pro bono project that is the web interface to the world's largest collection of lute music and it's a challenging collection from several points of view. The pieces are largely from 1400 to 1600, but they range from the mid-1200's to present day. Needless to say, there is tremendous variability in how the pieces are categorized and who they are attributed to. It is obvious that any sort of rigid, DB-enforced hierarchy isn't going to work with this collection, so my thoughts turn to tags.
But not all tags are the same. I'll have tags that represent a person/role (composer, translator, entabulator, etc.), tags that represent the instrument(s) the piece is written for, and tags that represent how the piece has been classified by any one of half a dozen different classification systems used over the centuries.
We will be using a semi-controlled tag vocabulary to prevent runaway tag proliferation (e.g. del.icio.us), but I want to treat the tags as belonging to different groups. People tags should not be offered when the editor is doing instrument tagging, etc.
Has anyone done something like this? I have several ways I can think of to do it, but if there is an existing system that is well-done it would save me time implementing/debugging.
FWIW: This is a Django system and I'm looking at starting with Django-tagging and then hacking from there, possibly adding a category field or ...
There's an issue #14 for django-tagging, filed back in 2007, that tries to address this problem. I don't know whether the developers are planning to add this feature or not.
However, there's a machinetags branch of django-tagging maintained by Gregor Müllegger at https://code.launchpad.net/~gregor-muellegger/django-tagging/machinetags/. It allows assigning namespaces (and/or values) to tags, and makes it easy to query tags by namespace/value. So you'd be able to tag a piece with instrument:<instrument_name> or instrument=<instrument_name>, for example.
It's mostly in sync with the django-tagging trunk (though a number of commits are missing). I remember working on a project using that branch about a year ago; it worked fine. Read the documentation in the branch and the comments on the issue for more details.
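The machine-tag idea can be sketched independently of the branch: tags of the form "namespace:value" (e.g. "instrument:lute", "composer:Dowland") grouped by namespace, so the editor is only offered the relevant vocabulary for each field. The tag names below are illustrative.

```python
# Sketch: group "namespace:value" machine tags by namespace so the
# editing UI can offer only one group's vocabulary at a time.
from collections import defaultdict

def group_by_namespace(tags):
    """Split 'namespace:value' tags into {namespace: [values]}."""
    groups = defaultdict(list)
    for tag in tags:
        namespace, _, value = tag.partition(":")
        if value:
            groups[namespace].append(value)
        else:
            groups[""].append(tag)  # plain, un-namespaced tag
    return dict(groups)

groups = group_by_namespace(
    ["instrument:lute", "instrument:theorbo", "composer:Dowland", "renaissance"]
)
```

With this shape, "people tags should not be offered during instrument tagging" becomes a one-line lookup of the right namespace.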