P2P Caching

Moonman
Posts: 22
Joined: Mon Nov 12, 2007 8:06 pm
Location: Geraldton, WA

Post by Moonman » Thu Nov 15, 2007 9:51 pm

Someone could always start a website somewhere in the big wide web for Exetel users with NZBs (for Usenet) and torrents that have already been run through the cache.

Of course, this isn't legal ;)

- aNt -
Posts: 76
Joined: Tue Jan 03, 2006 10:38 pm
Location: Western Sydney

Post by - aNt - » Fri Nov 16, 2007 10:55 pm

Moonman wrote: Of course, this isn't legal ;)
Then you probably shouldn't be posting suggestions about it, as it will never happen (not from Exetel, anyway) ;).

Moonman
Posts: 22
Joined: Mon Nov 12, 2007 8:06 pm
Location: Geraldton, WA

Post by Moonman » Fri Nov 16, 2007 11:55 pm

Well, no one suggested that anyone associated with the company needs to start it ;)

CoreyPlover
Volunteer Site Admin
Posts: 5922
Joined: Sat Nov 04, 2006 2:24 pm
Location: Melbourne, VIC

Post by CoreyPlover » Sat Nov 17, 2007 1:17 am

I'm still not sure whether people fully comprehend the difference between a P2P cache and a P2P mirror.

A mirror would provide the functionality that seems to be envisaged in this thread: targeting of popular files, an exhaustive catalogue of cached content, etc. But this new infrastructure is not a mirror; it is a dynamic cache.

My understanding is that the P2P caching system will act as a gateway between end Exetel users and the wider internet. All packets that pass through it contain a unique hash identifying them, and the caching system will remember these hashes. Any time it detects that a particular hashed packet is becoming "popular", it will copy the data corresponding to that packet onto the 4TB storage array. That way, the NEXT time it is requested, the system will simply serve up the local copy, resulting in faster speeds and cheaper bandwidth.

Hence, there will never be any guarantee of what content (at least at a file level) will be in the cache at any time, nor any guarantee that a particular torrent is 100% cached (since the caching works on a packet basis, not a file basis). It also means that you will never be able to maintain a private tracker that always hits the cache, since the cached content is highly dynamic (i.e. it has to continually clear the unpopular hashes and data to make room for the more popular ones).
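
To put that in rough code terms, the behaviour described above amounts to something like the sketch below (purely illustrative; the class name, the popularity threshold and the eviction rule are all made up, not Exetel's or the vendor's actual design):

```python
from collections import Counter

class DynamicChunkCache:
    """Sketch of a popularity-driven, hash-keyed chunk cache."""

    def __init__(self, capacity_chunks, popularity_threshold=3):
        self.capacity = capacity_chunks        # storage limit, in chunks (e.g. the 4TB array)
        self.threshold = popularity_threshold  # hits before a chunk is worth keeping
        self.hits = Counter()                  # chunk hash -> request count
        self.store = {}                        # chunk hash -> chunk data

    def request(self, chunk_hash, fetch_from_peers):
        self.hits[chunk_hash] += 1
        if chunk_hash in self.store:
            return self.store[chunk_hash]      # cache hit: served locally, fast and cheap

        data = fetch_from_peers()              # cache miss: fetched from the wider internet
        if self.hits[chunk_hash] >= self.threshold:
            if len(self.store) >= self.capacity:
                # clear the least popular chunk to make room for this one
                coldest = min(self.store, key=lambda h: self.hits[h])
                del self.store[coldest]
            self.store[chunk_hash] = data
        return data
```

Note that the cache holds whatever happens to be popular at the moment, which is exactly why nothing stays guaranteed to be in it.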

Attempting to "target" cached data doesn't make sense because there is no guarantee of any packet of your torrent residing in the cache at a given time. It is also not even possible, because the cache is a transparent gateway between you and your peers/seeds, not a separate IP that you can target as a proxy server.

You can only reap the benefits of the caching system indirectly, via popular torrents downloading faster (and possibly counting towards a separate download quota). It is mainly to reduce Exetel's running costs and so provide a more sustainable and viable service. This can then benefit all users because Exetel's cost per Gb quota is reduced and the savings can be returned to users in future plan pricing and peak / off-peak quotas.

Moonman
Posts: 22
Joined: Mon Nov 12, 2007 8:06 pm
Location: Geraldton, WA

Post by Moonman » Sat Nov 17, 2007 1:50 am

CoreyPlover wrote:You can only reap the benefits of the caching system indirectly, via popular torrents downloading faster (and possibly counting towards a separate download quota). It is mainly to reduce Exetel's running costs and so provide a more sustainable and viable service. This can then benefit all users because Exetel's cost per Gb quota is reduced and the savings can be returned to users in future plan pricing and peak / off-peak quotas.
That's why I suggested a site with the torrents and NZBs that a user has already downloaded. That way, people are downloading the same torrents and the same NZBs, and the cache will be hit.

CrackerJak
Posts: 37
Joined: Wed Nov 07, 2007 4:03 am
Location: NSW

Post by CrackerJak » Sat Nov 17, 2007 2:38 am

There has to be some sort of co-operation between Exetel users, otherwise hitting the cache could be like finding a needle in a haystack. Even popular material can have countless release groups providing essentially identical content, and finding the one that's cached is all just chance.
If Exetel wants the P2P cache system to reduce its network load by up to 30%, this "chance" factor needs to be eliminated or severely reduced. Agreeing on which release groups to use would be a good place to start.

dogwomble
Posts: 375
Joined: Sat Mar 17, 2007 12:21 am

Post by dogwomble » Sat Nov 17, 2007 7:00 am

tocpcs wrote: That'd be fine. Wait until the whole file is cached.
That may not necessarily happen. While I would imagine the cache setup would have quite a substantial amount of disk space, given the way some users use P2P, I'd expect it to fill up pretty quickly. When this happens, the older information (particularly anything that got relatively few hits) would be erased from the cache. Therefore, if you attempt a download directly from the cache of something only one or two users have downloaded in the past, it is possible the file will never complete without going back to the internet at large.
tocpcs wrote: The theory here is to stop all access except for cache access to save bandwidth.
It's a nice theory. It could work by setting the cache up as a proxy that you define in your client. However, given the above problem, I'm not convinced it would be an effective solution unless the client can automatically fall back to downloading from the internet at large - which, incidentally, is what would happen already if I understand the design correctly. Otherwise there are too many manual steps for the user, which would in turn increase the amount of support needed to help people set up their clients and to remind them to switch the proxy off if their downloads aren't completing.

dogwomble
Posts: 375
Joined: Sat Mar 17, 2007 12:21 am

Post by dogwomble » Sat Nov 17, 2007 7:03 am

CrackerJak wrote: There has to be some sort of co-operation between Exetel users, otherwise hitting the cache could be like finding a needle in a haystack. Even popular material can have countless release groups providing essentially identical content, and finding the one that's cached is all just chance.
If Exetel wants the P2P cache system to reduce its network load by up to 30%, this "chance" factor needs to be eliminated or severely reduced. Agreeing on which release groups to use would be a good place to start.
That may not be as much of an issue as you think. I've actually noticed that with some releases the file sizes are exactly the same; it's just the names that are different. If the cache works on a signature system, then, assuming the files are indeed absolutely identical, they will be treated as exactly the same file.
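
A quick illustration of that signature idea (the filenames in the comments are made up; the caveat is that BitTorrent piece hashes also depend on piece size and file layout within the torrent, so identical data in two differently built torrents won't necessarily share hashes):

```python
import hashlib

# The same bytes always produce the same signature, no matter what the file is called.
payload = b"byte-identical release data"

hash_group_a = hashlib.sha1(payload).hexdigest()  # released as "Show.E01.GroupA.avi"
hash_group_b = hashlib.sha1(payload).hexdigest()  # released as "Show.E01.GroupB.avi"

assert hash_group_a == hash_group_b  # one signature, one cache entry
```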

CLoSeR
Posts: 858
Joined: Fri Jun 11, 2004 1:49 pm
Location: West Ryde, NSW
Contact:

Post by CLoSeR » Sat Nov 17, 2007 8:28 am

CrackerJak wrote: There has to be some sort of co-operation between Exetel users, otherwise hitting the cache could be like finding a needle in a haystack. Even popular material can have countless release groups providing essentially identical content, and finding the one that's cached is all just chance.
If Exetel wants the P2P cache system to reduce its network load by up to 30%, this "chance" factor needs to be eliminated or severely reduced. Agreeing on which release groups to use would be a good place to start.
I don't think it's that difficult; I'm sure the technology will work it out.
Need to log a fault ticket? Go here: https://helpdesk.exetel.com.au/

Spanner_Man

Post by Spanner_Man » Sat Nov 17, 2007 11:29 am

I think people have misunderstood the whole idea of the cache system compared to what I perceive it to be.

As a perfect example: back when there was JPC support within some P2P applications, the client would probe the ISP's domain to see if there was a cache system currently running.

Then it would do the following (in basic terms):
1) Check whether the requested data chunk was located on the cache; if it was, retrieve it from the cache.
2) If 1) failed, retrieve it directly from outside the ISP's network ("backbone"), or the cache system would simply perform a data passthrough.

With any cache system, if the requested data chunk isn't stored, the cache passes the data through until the requesting client has the chunk, then stores that chunk for a period of time preset by requests per hour.
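
In rough code terms, that lookup/passthrough/store flow looks something like the sketch below (the names, TTL value and fetch callback are all invented for illustration; the real JPC protocol also involves the ISP-domain probe step):

```python
import time

CACHE_TTL_SECONDS = 3600   # hypothetical retention window; in practice tuned by requests/hour

cache = {}  # chunk id -> (data, time it was stored)

def get_chunk(chunk_id, fetch_from_swarm):
    entry = cache.get(chunk_id)
    if entry is not None and time.time() - entry[1] < CACHE_TTL_SECONDS:
        return entry[0]                    # 1) chunk is on the cache: serve it locally
    data = fetch_from_swarm(chunk_id)      # 2) cache miss: pass through to the outside network
    cache[chunk_id] = (data, time.time())  # then keep the chunk around for a while
    return data
```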

From what has been posted before, the cache system will automatically check whether traffic is P2P data or not, and that check takes up CPU time. For users who are advanced enough to manually configure their P2P client, the client could connect directly through the cache instead of that CPU time being wasted.

For users who have zero knowledge, fair enough: they don't need to know how to configure their applications to use the cache system, and they can have the CPU time that advanced users don't need.

Regardless of how many boxes or Beowulf-style clusters are installed, there will be a massive amount of CPU time involved just in figuring out whether a particular data flow is P2P at all, and not VPN or SSH.

thomashouseman
Posts: 750
Joined: Thu Mar 18, 2004 12:06 pm
Location: Toongabbie
Contact:

Post by thomashouseman » Wed Nov 21, 2007 10:23 am

I've been reading up on this caching system...

Overall it sounds very good... cost savings for Exetel (indirectly passed on to us) and faster torrents for the end users. Much better than the Allot approach of just slowing down P2P.

I love how they're handling encrypted torrents :P

PeerApp is addressing the challenge of P2P encryption in a couple of ways.

One is deployment of a combined shaping/caching solution in conjunction with leading DPI vendors. Such a comprehensive, one-stop solution for P2P traffic management allows an ISP to throttle down the encrypted traffic while caching the unencrypted traffic at the same time. Since the primary reason for P2P users to turn on encryption is to increase performance, such a policy should discourage encryption usage.

Another direction is capturing mindshare for P2P caching among ISPs and content owners. As P2P caching increasingly gains traction and ISPs continue to deploy and advertise their P2P caching solutions, the usefulness of P2P encryption is going to fade away.
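
In rough terms, the policy they describe boils down to something like the sketch below (the classification labels, rate cap and object names are all invented; this is just the idea, not their product's actual API):

```python
THROTTLE_KBPS = 256  # hypothetical cap applied to encrypted P2P flows

def handle_flow(flow, dpi_classifier, cache, shaper):
    kind = dpi_classifier.classify(flow)   # e.g. "p2p-encrypted", "p2p-clear", "other"
    if kind == "p2p-encrypted":
        shaper.limit(flow, THROTTLE_KBPS)  # make encrypted P2P slower, discouraging encryption
    elif kind == "p2p-clear":
        cache.serve(flow)                  # unencrypted P2P gets the cached fast path
    else:
        pass                               # non-P2P traffic is left alone
```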

Spanner_Man

Post by Spanner_Man » Wed Nov 21, 2007 10:40 am

The second that encrypted traffic is throttled, it will affect business-critical services (e.g. VPN/RDP) more than it does now.

And no, this isn't speculation on my part: using the same ADSL modem with a different provider, connected through the same exchange, there is a remarkable difference between VPN tunnelled through Exetel and through other providers. (I get better SNR and line attenuation through Exetel than with the other provider.)

Overall though, Exetel is very attractively priced, so I'll cope :)

thomashouseman
Posts: 750
Joined: Thu Mar 18, 2004 12:06 pm
Location: Toongabbie
Contact:

Post by thomashouseman » Wed Nov 21, 2007 10:49 am

Oh. I assumed they were only talking about slowing down P2P encryption... see the last line of the quote.

Spanner_Man

Post by Spanner_Man » Wed Nov 21, 2007 11:15 am

Yes, I saw that.
Although that comes down to personal privacy versus the supposedly illegal usage of P2P tech.

I'm not going to go into detail about encrypting torrents or using encryption overall, as there are many other forums, message boards and wikis that cover it, for both personal privacy and the supposedly illegal activity.

PS: I worded this post carefully so that it isn't directed at anyone.

sh0nky
Posts: 143
Joined: Sat Sep 15, 2007 9:09 pm
Location: Bankstown/Cronulla

Post by sh0nky » Thu Nov 22, 2007 8:32 am

Any chance we can get an update from someone at Exetel regarding how the P2P cache trials are going? Just interested to learn what the initial results are.
Hard work never killed anyone... but I'm not taking any chances :)
