Core Feature: Remote Storage Backends #24

Open · 10 tasks · jpillora opened this issue Jan 10, 2016 · 16 comments
@jpillora (Owner) commented Jan 10, 2016

cloud-torrent is commonly run on cheap/free hosting providers, which offer limited disk space and RAM but plenty of bandwidth. Given this, it makes sense to support remote storage backends.

Required:

Optional:
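As a rough illustration of the idea (a hypothetical sketch, not anything already in cloud-torrent or the task lists above), such a backend could be modelled in Go as a stream-oriented interface, so torrent data is written straight to the remote service instead of being kept on the local disk:

```go
package storage

import (
	"context"
	"io"
)

// Backend is a hypothetical remote-storage abstraction: torrent data is
// streamed to the remote service rather than stored locally.
type Backend interface {
	// Put streams r to the backend under the given name and returns once
	// the remote write has completed.
	Put(ctx context.Context, name string, r io.Reader) error
	// Get opens the named object for reading (e.g. for serving it later).
	Get(ctx context.Context, name string) (io.ReadCloser, error)
	// Delete removes the named object from the backend.
	Delete(ctx context.Context, name string) error
}
```

Concrete implementations (S3, Dropbox, FTP, SFTP, ...) would then differ only in how Put and Get are wired to the provider's client library.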

@t-pankajkumar

Hi jpillora,
I have written a Java server application that takes a download URL as input and uploads the file to Dropbox; it handles files of any size, depending on the account's storage. I have hosted it on OpenShift.

Please have a look at my code:
https://github.com/t-pankajkumar/DropBoxJava/blob/master/DropBoxJava/src/com/drop/box/UrlUploadMain.java

I hope this helps.
Thank you :)

@joeblew99

Suggest you check out Minio on GitHub. It's an S3 client in Go, but it also supports Google Cloud Storage (GS) and its own parallel storage server.
It might be a better way to integrate with afero.
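For reference, here is a minimal sketch of a streaming upload with the minio-go client, assuming the current v7 API; the endpoint, bucket name, object name and credentials are placeholders:

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/minio/minio-go/v7"
	"github.com/minio/minio-go/v7/pkg/credentials"
)

func main() {
	// Placeholder endpoint and credentials -- replace with real values.
	client, err := minio.New("s3.amazonaws.com", &minio.Options{
		Creds:  credentials.NewStaticV4("ACCESS_KEY", "SECRET_KEY", ""),
		Secure: true,
	})
	if err != nil {
		log.Fatal(err)
	}

	// In cloud-torrent this reader would be the torrent data stream; a
	// local file is used here only to keep the example self-contained.
	f, err := os.Open("downloads/file.bin")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// With size -1, PutObject uses a multipart/streaming upload, so the
	// whole file never has to be buffered in memory.
	_, err = client.PutObject(context.Background(), "my-bucket", "file.bin", f, -1,
		minio.PutObjectOptions{ContentType: "application/octet-stream"})
	if err != nil {
		log.Fatal(err)
	}
}
```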

@jpillora added this to the 0.9 milestone on Mar 8, 2016
@besoeasy

Maybe an option to move downloaded torrents to an FTP server?

@jpillora (Owner, Author)

@besoeasy ah yep, will add that and SCP/SFTP to the list (though note, it will stream to the remote server, so there won't be a local copy)
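For the SCP/SFTP case, streaming without a local copy might look roughly like the sketch below, assuming the github.com/pkg/sftp and golang.org/x/crypto/ssh packages; the host, credentials and paths are placeholders, and the local file merely stands in for the torrent data stream:

```go
package main

import (
	"io"
	"log"
	"os"

	"github.com/pkg/sftp"
	"golang.org/x/crypto/ssh"
)

func main() {
	// Placeholder host and credentials -- replace with real values.
	conf := &ssh.ClientConfig{
		User:            "user",
		Auth:            []ssh.AuthMethod{ssh.Password("password")},
		HostKeyCallback: ssh.InsecureIgnoreHostKey(), // verify host keys in real use
	}
	conn, err := ssh.Dial("tcp", "example.com:22", conf)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client, err := sftp.NewClient(conn)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Stand-in for the torrent data stream.
	src, err := os.Open("downloads/file.bin")
	if err != nil {
		log.Fatal(err)
	}
	defer src.Close()

	dst, err := client.Create("/remote/path/file.bin")
	if err != nil {
		log.Fatal(err)
	}
	defer dst.Close()

	// io.Copy streams chunk by chunk, so nothing larger than the copy
	// buffer is ever held locally.
	if _, err := io.Copy(dst, src); err != nil {
		log.Fatal(err)
	}
}
```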

@t-pankajkumar

Here is a small code snippet to auto-upload torrents to Openload. I wrote it as a Node.js app with Express:

```js
var express = require('express');
var WebTorrent = require('webtorrent');
var path = require('path');
var request = require('request');

var app = express();
var client = new WebTorrent();

var openload_api = 'your openload api';
var openload_key = 'your openload key';
var oUrl = 'https://api.openload.co/1/remotedl/add?login=' + openload_api + '&key=' + openload_key + '&url=';

app.use(express.static(__dirname + '/public'));

// views is the directory for all template files
app.set('views', __dirname + '/views');
app.set('view engine', 'ejs');

// provides a download link for downloaded files
app.get('/download', function (req, res) {
  var file = path.join(__dirname, 'public', req.query.file);
  res.download(file); // sets the Content-Disposition header and sends the file
});

// to add a torrent, open 'your_heroku_name.herokuapp.com/torAdd?magnet=magnet_link'
app.get('/torAdd', function (req, res) {
  console.log('started');
  client.add(req.query.magnet, { path: 'public' }, function (torrent) {
    torrent.on('done', function () {
      console.log('torrent download finished');
      torrent.files.forEach(function (file) {
        console.log(file.name + ' ' + file.length + ' ' + file.path + '\n');
        // tell Openload to fetch the file from this app's /download route
        // (encode the path and the URL so query strings stay valid)
        var downloadUrl = 'https://your_heroku_name.herokuapp.com/download?file=' + encodeURIComponent(file.path);
        request(oUrl + encodeURIComponent(downloadUrl), function (error, response, body) {
          if (!error && response.statusCode == 200) {
            console.log(body); // log the Openload API response
          }
        });
      });
    });
  });
  res.send('downloading');
});

app.listen(process.env.PORT || 3000);
```

@dubailive commented Nov 24, 2016

Hello, great script here. I created a Digital Ocean droplet and installed it; it works great. Just wondering whether Cloud Torrent can be installed on a dedicated server from a hosting company running Ubuntu 16.04, instead of only on Digital Ocean or other cloud providers? The advantage would be much larger storage and unlimited bandwidth at a cheaper price.

@sxml commented Dec 2, 2016

@t-pankajkumar How do I add this code?

@t-pankajkumar commented Dec 4, 2016

@sxml Just create a simple Node.js app on Heroku and use that script; I have been downloading torrents via my own site hosted on Heroku :)
If you need it, I will share my code with you.

@cinetube commented Dec 4, 2016

@t-pankajkumar - please share your torrent site code.

@ajthemacboy

Google Drive, Amazon Cloud Drive, and Dropbox don't allow certain... sensitive media to be hosted on their platforms. Basic encryption or obfuscation may be useful, but I think it would require a web frontend. Maybe something to consider for a future version.
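As a rough sketch of what "basic encryption" of an uploaded stream could look like, here is a hypothetical AES-CTR reader wrapper in Go; the encryptingReader helper is not part of cloud-torrent, and CTR mode provides confidentiality/obfuscation only, not integrity protection:

```go
package storage

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"io"
)

// encryptingReader wraps r so that everything read from it comes out
// AES-CTR encrypted. The returned IV must be stored alongside the object,
// since the same key and IV are needed to decrypt on the way back.
func encryptingReader(key []byte, r io.Reader) (io.Reader, []byte, error) {
	block, err := aes.NewCipher(key) // key must be 16, 24 or 32 bytes long
	if err != nil {
		return nil, nil, err
	}
	iv := make([]byte, aes.BlockSize)
	if _, err := io.ReadFull(rand.Reader, iv); err != nil {
		return nil, nil, err
	}
	stream := cipher.NewCTR(block, iv)
	return &cipher.StreamReader{S: stream, R: r}, iv, nil
}
```

The wrapped reader could then be handed to whichever upload call the chosen backend uses, so files land on Google Drive/Dropbox/etc. in an opaque form.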

@Kingkai2009

Hello @t-pankajkumar, can you show how to do this on Heroku with your code? It would be very helpful for me.

@teun95 commented May 19, 2018

Would it not make sense to allow rclone to be used for this, at least for the time being? It already supports most of the backends mentioned here, as well as encryption. The ability to move files once they are done downloading and to run post-processing scripts might even make integrating this unnecessary.

@diman82 commented Aug 1, 2018

@teun95 You're missing the point. If you download a 20GB file, you have to wait for it to download to local storage, and only then can you sync it to some cloud. The purpose of this important feature is to stream the contents, "uploading on the fly".
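In Go terms, that "uploading on the fly" behaviour is essentially a pipe between the torrent reader and the uploader; the sketch below is hypothetical, and uploadToCloud is a placeholder for whatever backend call is eventually used:

```go
package storage

import "io"

// uploadToCloud is a placeholder for a real backend call (S3, SFTP, ...);
// here it simply drains the reader as if it were uploading.
func uploadToCloud(name string, r io.Reader) error {
	_, err := io.Copy(io.Discard, r)
	return err
}

// streamUpload pipes torrent data straight into the cloud upload, so only a
// small in-memory buffer ever exists locally -- no 20GB intermediate file.
func streamUpload(name string, torrentData io.Reader) error {
	pr, pw := io.Pipe()
	go func() {
		_, err := io.Copy(pw, torrentData)
		pw.CloseWithError(err) // a nil error closes the pipe normally (EOF)
	}()
	return uploadToCloud(name, pr)
}
```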

@teun95 commented Aug 3, 2018

@diman82 I was not aware of that. As far as I know that would be a first; I have not seen it before at commercial seedbox services. I know that for some storage providers, rclone is able to upload while streaming, as long as the files are downloaded sequentially; this is used when mounting online storage. I am not sure how usable that is for cloud-torrent, though.

But what you propose sounds like a lot of work that will take a long time, and it will not be available for many different cloud storage providers, let alone with encryption. Even in a limited form, though, it sounds like a really cool feature. Perhaps it would additionally make sense to make using rclone (or any other custom script) possible as part of post-processing? Let's not forget that the storage limitations of cheap VPS servers might become less of an issue pretty rapidly with declining storage prices.

@diman82 commented Aug 3, 2018

@teun95 You're right in concept, but your request can be quite easily achieved with a shell script, and it doesn't directly relate to the current project.
Simply monitor a folder, and rclone will pick up any changes and copy/move the content to a pre-configured cloud remote.
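A minimal post-download hook along those lines, sketched here in Go via os/exec rather than as a shell script (the "gdrive:torrents" remote and the local path are placeholders that assume an existing rclone configuration):

```go
package main

import (
	"log"
	"os/exec"
)

// moveToRemote invokes `rclone move` on a completed download directory,
// leaving nothing behind locally once the transfer succeeds.
func moveToRemote(localDir string) error {
	cmd := exec.Command("rclone", "move", localDir, "gdrive:torrents")
	out, err := cmd.CombinedOutput()
	if err != nil {
		log.Printf("rclone output: %s", out)
	}
	return err
}

func main() {
	if err := moveToRemote("./downloads/finished"); err != nil {
		log.Fatal(err)
	}
}
```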
