rsouissi
Hi all,
I hope asking this many questions is not too much...
Question 1)
In the stock streaming demo we are trying to build, we need to push the real-time trade transactions executed for a particular stock item to selected users. I realize this requires significant bandwidth and a high update frequency, and my assumption for the demo is that there are no restrictive limitations in this regard.
However, I have some problems. The frequency was set to 20, bandwidth to 40 and bufferSize to 100, but I can still see that the Lightstreamer client is not receiving "enough" transaction updates and many of them get dropped.
For instance, the feed can deliver 5 new transactions for a specific item in the same millisecond, but in total it only delivers 100 new transactions in 1 second and then calms down. So technically, if Lightstreamer can keep 100 updates in the buffer/history, it should be able to deliver all 100 updates to the client at the assigned client/server frequency, right? So I should be able to receive the 100 updates in the client within 100/20 = 5 seconds? But in reality, I receive only a few of them.
Any help in this regard ?
For info, the subscription is "MERGE" because of the issue discussed below.
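To illustrate the arithmetic above, here is a rough, self-contained sketch (plain Java, not the Lightstreamer API; the numbers 100 and 20 come from the settings in the question) of the difference between draining a bounded queue at a fixed frequency and MERGE-style conflation, where queued states are overwritten by newer ones:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Rough model of the question-1 scenario (numbers from the post, not
// the actual Lightstreamer implementation).
public class BufferDrainSketch {
    public static void main(String[] args) {
        int bufferSize = 100;        // configured bufferSize
        double maxFrequency = 20.0;  // max updates per second per item

        // Queue-like delivery: 100 buffered events drained at 20/s
        // take 100 / 20 = 5 seconds to arrive at the client.
        Deque<Integer> buffer = new ArrayDeque<>();
        for (int i = 0; i < bufferSize; i++) buffer.add(i);
        double drainSeconds = buffer.size() / maxFrequency;
        System.out.println("drain time: " + drainSeconds + " s"); // 5.0 s

        // Conflated delivery (as in MERGE): intermediate states are
        // overwritten, so a burst collapses to the single latest state.
        int latest = 0;
        for (int i = 0; i < bufferSize; i++) latest = i;
        System.out.println("after conflation, one update survives: " + latest);
    }
}
```

The 5-second figure only holds if every buffered event is actually queued for delivery; with conflation only the latest state survives, which matches receiving "only a few" updates.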
Question 2)
For the trade transactions, I first used DISTINCT to get the updates, but this caused me to receive a lot of NULL/UNCHANGED updates instead. For instance:
- The data adapter generates 100 "different" transaction events to lightstreamer (most of the fields are different, including the timestamp)
- Client subscribing to this item in DISTINCT mode receives:
[=====>TRADE] update for 1: [0001, null, 07:00:00:051, 77.500000, 120, 12780, 5186]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:057, UNCHANGED, 38, 12808, 6860]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:062, UNCHANGED, 20, 12837, 9005]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:068, UNCHANGED, 50, 12865, 10453]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:074, UNCHANGED, 10, 12896, 12126]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:079, UNCHANGED, 65, 12922, 13299]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:081, UNCHANGED, 20, 12929, 13659]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED]
... more items in the same form, all UNCHANGED
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED, UNCHANGED]
- Client subscribing to this item in MERGE mode receives:
[=====>TRADE] update for 1: [0001, null, 07:00:00:051, 77.500000, 190, 12782, 5390]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:057, UNCHANGED, 17, 12811, 6977]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:063, UNCHANGED, 65, 12840, 9250]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:069, UNCHANGED, 25, 12869, 10648]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:075, UNCHANGED, 20, 12900, 12286]
[=====>TRADE] update for 1: [UNCHANGED, UNCHANGED, 07:00:00:081, UNCHANGED, UNCHANGED, 12929, 13659]
The problem is that MERGE makes a lot of updates disappear, while DISTINCT makes some disappear and leaves many containing nothing (NULL/UNCHANGED).
The idea was to receive most of the updates; how can I achieve this?
Question 3)
I need to deliver a news feed through the same stock streaming demo in the following way:
- Any time a user subscribes to the news, he should receive all the news items of the current day
- Then the user needs to receive the news updates in real-time.
How can I achieve this ?
Question 4)
Performance-wise, which one is better?
- Data adapter filters out all the redundant data (unchanged fields) and only sends real updates to Lightstreamer
- Data adapter sends everything to Lightstreamer, which then handles the unchanged data
Thanks in advance,
R
Dario Crivelli
Hi Riad
about your questions 1 and 2,
the Server sends "UNCHANGED" placeholders when field values are indeed unchanged across subsequent events (of course, the "UNCHANGED" notification is not sent in this form on the socket).
Could you please confirm that all those "UNCHANGED" notifications you receive correspond to what you expect, based on what you send from your Data Adapter?
Since you say that
@rsouissi, Wrote:
- The data adapter generates 100 "different" transaction events to lightstreamer (most of the fields are different, including the timestamp)
, this point has to be clarified first.
You can check the content of the events received from your Data Adapter by setting as DEBUG the "LightstreamerLogger.subscriptions" category in the Server logging configuration file (i.e. lightstreamer_log_conf.xml).
If you sent sequences of events that are equal (with respect to the subscribed fields), then in MERGE mode those events would indeed be filtered, as happened in your case.
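To illustrate the check described above, here is a small self-contained sketch (plain Java, not the Server's real code): each event is diffed field by field against the previous one, unchanged fields become placeholders, and an event whose fields are all unchanged can be filtered out entirely in a mode like MERGE:

```java
import java.util.Arrays;

// Illustration only: mimics the per-field "UNCHANGED" logic discussed
// in the thread, not the Server's actual implementation.
public class UnchangedFilterSketch {
    static final String UNCHANGED = "UNCHANGED";

    // Returns the update as the client would see it, or null if the
    // whole event is filterable (all fields unchanged).
    static String[] diff(String[] previous, String[] event) {
        String[] out = new String[event.length];
        boolean allUnchanged = true;
        for (int i = 0; i < event.length; i++) {
            if (previous != null && event[i].equals(previous[i])) {
                out[i] = UNCHANGED;
            } else {
                out[i] = event[i];
                allUnchanged = false;
            }
        }
        return allUnchanged ? null : out;
    }

    public static void main(String[] args) {
        String[] e1 = {"0001", "07:00:00:051", "77.50", "120"};
        String[] e2 = {"0001", "07:00:00:057", "77.50", "38"};
        String[] e3 = {"0001", "07:00:00:057", "77.50", "38"}; // identical to e2

        System.out.println(Arrays.toString(diff(null, e1))); // all fields sent
        System.out.println(Arrays.toString(diff(e1, e2)));   // time + qty change
        System.out.println(diff(e2, e3)); // null: fully unchanged, filterable
    }
}
```

If the log shows your Data Adapter really pushing identical consecutive events, this is exactly the pattern that produces the all-UNCHANGED rows seen in the question.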
@rsouissi, Wrote:
Question 3)
I need to deliver a news feed through the same stock streaming demo in the following way:
- Any time a user subscribes to the news, he should receive all the news items of the current day
- Then the user needs to receive the news updates in real-time.
How can I achieve this ?
You can take advantage of Lightstreamer snapshot management (introduced in DOCS-SDKs/General Concepts.pdf, paragraph 3.4).
If your news items are managed in DISTINCT mode, then the Server can keep a list of the last updates received, so that they can be sent to the clients at subscription time, before the real-time updates.
This is useful to fill fixed size scrolling windows with initial data. For your needs (i.e. keeping all the updates of the current day), this has two limitations:
- unlimited lists are not supported (you have to configure a limit for historical data lists)
- clearing of the list is not supported (if your Server is not restarted overnight, you need to use different item names for different days)
Those limitations can be overcome by managing items in COMMAND mode, though this adds some complication both on the Client and on the Data Adapter side. Would you like us to expand on this?
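The DISTINCT snapshot behaviour described above (a bounded list of the last updates, replayed to each new subscriber before the real-time stream) can be sketched as follows; the class name and the limit of 3 are made up for illustration and are not Lightstreamer code:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Toy model of a DISTINCT snapshot list: keep the last N updates and
// replay them to a new subscriber before the real-time ones.
public class SnapshotSketch {
    private final int limit;
    private final Deque<String> history = new ArrayDeque<>();

    SnapshotSketch(int limit) { this.limit = limit; }

    void publish(String update) {
        if (history.size() == limit) history.removeFirst(); // oldest dropped
        history.addLast(update);
    }

    // What a newly subscribing client receives first.
    List<String> snapshotForNewSubscriber() {
        return new ArrayList<>(history);
    }

    public static void main(String[] args) {
        SnapshotSketch news = new SnapshotSketch(3); // configured limit
        for (String n : new String[]{"n1", "n2", "n3", "n4"}) news.publish(n);
        System.out.println(news.snapshotForNewSubscriber()); // [n2, n3, n4]
    }
}
```

The bounded deque shows both limitations at once: "n1" is silently evicted once the limit is reached, and there is no way to clear the list short of using a fresh item (here, a fresh instance) for the next day.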
@rsouissi, Wrote:
Question 4)
Performance wise, which one is better ?
- Data adapter to filter all the redundant data (unchanged fields) and only send real updates to Lightstreamer
- Data adapter to send everything to lightstreamer which will handle the unchanged data
If some updates may be identical to the preceding ones AND such updates are not useful for the clients, then filtering them in the Data Adapter may be more efficient than sending them.
This is because Lightstreamer Server performs the check for unchanged field values only at session level; this means that the same check may be repeated multiple times for the same update event, depending on the number of clients subscribing to the same item.
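A rough comparison count makes the difference concrete (illustrative Java, not Lightstreamer internals; the event and session counts are made-up numbers):

```java
// Illustrative cost model for question 4: the unchanged-fields check
// runs once per subscribed session when done by the Server, but only
// once in total when the Data Adapter filters upstream.
public class FilterCostSketch {
    public static void main(String[] args) {
        int events = 100;      // events pushed by the feed (assumed)
        int sessions = 1_000;  // sessions subscribed to the item (assumed)

        long serverSideChecks = (long) events * sessions; // per-session diff
        long adapterSideChecks = events;                  // single diff upstream

        System.out.println("Server-side checks:  " + serverSideChecks);  // 100000
        System.out.println("Adapter-side checks: " + adapterSideChecks); // 100
    }
}
```

With many sessions on the same item, the per-session check dominates, which is why dropping truly redundant events in the Data Adapter can pay off.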
Dario