In a Flex ColumnChart, the column height depends on the value. When two values differ greatly, the smaller value is barely visible above the axis. Is it possible to define a minimum column height, so that even a very small value can be seen?
Typically, in any charting library you'll want to do this by controlling the vertical axis. For example, consider the following data:
Foo | Bar | Baz
0.7 | 30 | 80
If you were to chart this and let Flex calculate the vertical axis automatically, and it chose an axis running from 0.7 to 80, then Foo would barely show up.
However, if you specify the vertical axis yourself, you can programmatically choose good axis values. For example, let maximumValue be the (previously calculated) maximum value of your data and minimumValue be the (previously calculated) minimum value. Then you can set your axis min and max as follows...
axisMinimum = minimumValue - ((maximumValue - minimumValue) * 0.2)
axisMaximum = maximumValue + ((maximumValue - minimumValue) * 0.2)
This ensures that the smallest value sits noticeably above the bottom of the chart and the largest value noticeably below the top. (With a 0.2 multiplier on both ends the axis span grows to 1.4× the data range, so the extremes land at roughly 14% and 86% of the axis, not exactly 20% and 80%.) You can play with the multipliers to get a chart that looks good to you.
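The padding calculation above can be sketched as a small helper. This is a generic JavaScript illustration, not Flex API code; the function name and 0.2 default are choices made here:

```javascript
// Compute padded axis bounds so the smallest value remains clearly visible.
// `padding` is the fraction of the data range added on each side of the axis.
function paddedAxisBounds(values, padding = 0.2) {
  const minimumValue = Math.min(...values);
  const maximumValue = Math.max(...values);
  const range = maximumValue - minimumValue;
  return {
    axisMinimum: minimumValue - range * padding,
    axisMaximum: maximumValue + range * padding,
  };
}
```

For the Foo/Bar/Baz data above, `paddedAxisBounds([0.7, 30, 80])` yields an axis from about -15.16 to 95.86, so the 0.7 bar is drawn well inside the plot instead of hugging the axis line.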
The only disadvantage is that when values are very close together, this padding makes them look even closer.
I am trying to perform my own hit-testing logic against dynamic Y axes in my LightningChart JS charts. At the moment I am off by the number of pixels the axes take up, and I have not found a way to determine this value.
The Axis documentation shows Axis.getHeight(), which returns the height in pixels for the X axis.
Is there a way to set/read the width of a Y axis in LightningChart JS?
Thanks
Edit: As requested for more information.
I am using pointer down/move/end/out to detect finger/pointer/mouse position over the charts.
This particular chart presents, for example, the following data:
[x: 0, y: 20]
[x: 3600, y: 21]
[x: 86400, y: 19]
Where x is time in seconds and y is temperature in Celsius.
This is rendered as a line series for the visual, plus a point series for interaction. Users can drag the points up/down in an interval of, say, 0.5 °C, and left/right in a time interval of, say, 600 seconds (10 minutes).
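The interval dragging described above amounts to snapping the dragged value to a grid. A minimal sketch; the helper name and the step values are illustrative, not part of the LightningChart API:

```javascript
// Snap a dragged value to the nearest multiple of `step`,
// e.g. a 0.5-degree temperature step or a 600-second time step.
function snapToStep(value, step) {
  return Math.round(value / step) * step;
}
```

So a point dragged to 20.3 °C snaps to 20.5 with a 0.5 step, and a point dragged to x = 3700 snaps to 3600 with a 600-second step.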
On tablet, this interaction must be performed by first tapping on the point (to activate it, and present a tooltip) and then dragging the point. This is to prevent conflict with panning/zooming the chart.
All of the above has been developed and is working, except when the Y axes are visible and affect the chart spacing.
I need a way to calculate, precisely, the collective width of all Y axes to support the manual hit-testing logic.
Could you elaborate on what kind of hit testing you are doing? There may be a better way to do it than something based on axis size, hence the question.
The getHeight() method is available for both the X and Y axes. To use it for a Y axis, just call chart.getDefaultAxisY().getHeight(). However, this method is a bit unreliable - the result can be a frame behind. For example, if you use it immediately after the chart is created, it might return a wrong value. A possible workaround is to call it after a timeout.
Another way to know the width of a Y axis for sure is to configure it explicitly yourself with Axis.setThickness(100). This makes the axis always 100 pixels wide.
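Once the thickness is fixed with setThickness, the hit-test correction becomes a plain subtraction. A sketch, assuming a single Y axis on the left; the helper name and the 100 px constant are assumptions made here, not LightningChart API:

```javascript
// With the Y axis forced to a known width (e.g. axis.setThickness(100)),
// convert a pointer x-position measured from the chart element's left edge
// into an x-position measured from the plotting area's left edge.
const Y_AXIS_THICKNESS_PX = 100; // the value passed to setThickness (assumption)

function pointerToPlotAreaX(pointerX, yAxisThickness = Y_AXIS_THICKNESS_PX) {
  return pointerX - yAxisThickness;
}
```

With several visible Y axes you would pass the sum of their thicknesses instead.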
I've realized the question was poorly written and decided to rewrite it:
I'm making a bar that is supposed to be synced to a timer, so that as the timer gets closer to 0 the bar fills more, until at 0 the bar is at 100%.
So instead of the bar filling more as the percentage value grows, I want the bar to fill more as the value gets smaller.
The math is fairly simple. For example, say you want to represent the progress bar at 75% (we will represent percentage as a number from 0 to 100). The math is:
(percentage / 100) * totalWidthOfProgressBar
So if we want the bar 75% full and the progress bar is 100 units wide, the math would be:
(75 / 100) * 100 = 75 units
If the progress bar's width was 200 units then the math would be:
(75 / 100) * 200 = 150 units
And 150 is 75% of 200.
This is one way, but there are certainly others. Some progress-bar widgets have functions that calculate this for you automatically; take a look at the documentation of the progress bar you are using. Good luck and happy coding.
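Applying that math to the countdown in the question, the fill simply grows as the remaining time shrinks. A minimal sketch; the function and parameter names are illustrative:

```javascript
// Width of the filled portion of a progress bar that fills up
// as a countdown timer approaches 0.
function countdownFillWidth(timeLeft, totalTime, barWidth) {
  // 0 when the full time remains, 1 when the timer hits 0
  const fraction = (totalTime - timeLeft) / totalTime;
  return fraction * barWidth;
}
```

For a 10-second timer and a 200-unit bar: at 10 s left the fill is 0, at 5 s left it is 100, and at 0 s it is the full 200.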
Normally, the longest bar spans all the way to the right border.
I want it as shown in the picture.
I've achieved that by computing the maximum value and setting xaxis: { max: maxValue * 1.1 }, but that's a bit hacky.
I have tried - without success:
grid: {
margin: 30,
minBorderMargin: 10,
},
You can add the autoscaleMargin property to your xaxis options (as long as you aren't setting a min or max value for your xaxis):
xaxis: {
autoscaleMargin: .02
}
From the Flot API documentation:
The "autoscaleMargin" is a bit esoteric: it's the fraction of margin that the scaling algorithm will add to avoid that the outermost points ends up on the grid border. Note that this margin is only applied when a min or max value is not explicitly set. If a margin is specified, the plot will furthermore extend the axis end-point to the nearest whole tick. The default value is "null" for the x axes and 0.02 for y axes which seems appropriate for most cases.
This JSFiddle shows an example of using autoscaleMargin to bump the grid border away from the longest bar.
I want to set a minimum value for a Flot graph, but if a value beyond that minimum is present, it should auto-scale.
There's no built-in way to do this.
You'll need to loop through your points before plotting, and if any of them exceed your minimum value, then don't provide the axis min.
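That pre-plot check can be sketched as follows; the function name and the shape of the return value are choices made here, but the `{ min: ... }` option is standard Flot axis configuration:

```javascript
// Build y-axis options for Flot: use a fixed minimum unless the data
// actually dips below it, in which case omit `min` so Flot auto-scales.
// Flot data points are [x, y] pairs.
function yAxisOptions(points, desiredMin) {
  const dataMin = Math.min(...points.map(p => p[1]));
  return dataMin < desiredMin ? {} : { min: desiredMin };
}
```

You would then pass the result into the plot call, e.g. `$.plot(el, [points], { yaxis: yAxisOptions(points, 0) })`.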
It would be great to clarify how colors are calculated when plotting a treemap (I use the gvisTreeMap function from the R googleVis library).
The documentation is not very informative. What is meant by "The color value is first recomputed on a scale from minColorValue to maxColorValue"? Usually I use a treemap to display sales (size) and sales difference (color). So ideally I would like to color the rectangles so that I can distinguish positive from negative growth, which as I understand is not possible at the moment.
What bothers me most right now is that "... colors are valued relative to all other nodes in the graph". Is there any way to fix the colors, so that a sales difference of, say, -25 always gets the same color?
If I have understood your problem correctly, I believe the following will solve it:
Let's say your data is percentages, so values go from 0 to 100. Set minColorValue=-100 and maxColorValue=100.
(Or, if using a different range, set it so that the minimum is the negative of the maximum, making the midpoint of the scale 0.)
Then, if you set the colors to, for example, minColor='red' and maxColor='green', this solves part 1 (negative values will be displayed in red, and positive in green).
Also, setting minColor and maxColor seems to fix the scale the colors are calculated against, so this also solves part 2 (that is, -25 will then always get the same color in the graph).
Color is computed as the average color value of all child nodes of a branch. A branch with no child nodes uses the color value from the DataTable. This color value is then scaled on the minColorValue to maxColorValue scale, and a color is computed between minColor and maxColor based on the scale.
Colors are not relative to other nodes on the graph - the size of the node is relative.
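The scaling described above behaves like a linear interpolation between the two endpoint colors. A sketch of that idea; this is an illustration of the math, not the googleVis or Google Charts source, and the RGB-triplet representation is an assumption:

```javascript
// Interpolate a color for `value` on the minColorValue..maxColorValue scale,
// mixing minColor and maxColor channel by channel. Colors are [r, g, b] arrays.
function treemapColor(value, minColorValue, maxColorValue, minColor, maxColor) {
  const clamped = Math.min(Math.max(value, minColorValue), maxColorValue);
  const t = (clamped - minColorValue) / (maxColorValue - minColorValue);
  return minColor.map((c, i) => Math.round(c + (maxColor[i] - c) * t));
}
```

Because the scale endpoints are fixed by minColorValue/maxColorValue rather than by the data, the same input value (e.g. -25 on a -100..100 scale) always maps to the same color regardless of the other nodes.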