doubt about fan sites

Chefao

Member
Apr 30, 2017
45
27
I'm making a tool to show the latest Habbo furniture, like a fan site does, but I can never get it to work. How do they do it?

I've tried PHP cURL and PHP file_get_contents, but neither works:

PHP:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://www.habbo.com/gamedata/furnidata_xml/f6a4ed5273c975e70a74becfd411367de0d56240");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
$response = curl_exec($ch); // the original snippet never executed the request
curl_close($ch);
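One common reason both cURL and file_get_contents seem to "never work" against gamedata URLs is a request without a browser-like User-Agent header, which some CDNs reject. Below is a minimal file_get_contents sketch that sends one via a stream context; the exact header requirement is an assumption, and the URL hash is simply the one from the snippet above.

```php
<?php
// Build stream-context options that send a browser-like User-Agent.
// Whether Habbo's servers actually require this header is an assumption;
// the User-Agent string below is just an example value.
function buildHttpContextOptions(string $userAgent): array
{
    return [
        'http' => [
            'method'  => 'GET',
            'header'  => "User-Agent: $userAgent\r\n",
            'timeout' => 5,
        ],
    ];
}

$url = 'https://www.habbo.com/gamedata/furnidata_xml/f6a4ed5273c975e70a74becfd411367de0d56240';
$context = stream_context_create(buildHttpContextOptions('Mozilla/5.0 (compatible; FurniTool/1.0)'));
$xml = @file_get_contents($url, false, $context);

if ($xml === false) {
    echo "Download failed\n";
} else {
    echo substr($xml, 0, 200), "\n"; // peek at the start of the furnidata XML
}
```

The same idea applies to the cURL version: set CURLOPT_USERAGENT to a fixed browser-style string rather than `$_SERVER['HTTP_USER_AGENT']`, which is empty when the script runs from the command line or cron.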
 

Bran

habcrush.pw
Mar 13, 2017
1,789
1,608
Certain fan sites are given the furniture icons before release (Puhekupla does; I've spoken to the owner about it) so they can showcase them a few months ahead of time.
 

JayC

Always Learning
Aug 8, 2013
5,505
1,401
You could make a tool that crawls the furnidata file, finds any SWFs you don't have in your directory, and downloads them, then run the script every 24 hours.
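A rough sketch of that idea: load the furnidata XML, diff its classnames against the SWFs already on disk, and fetch whatever is missing. The `furnitype` attribute names and the images.habbo.com URL pattern are assumptions based on the classic Habbo CDN layout, not something confirmed in this thread.

```php
<?php
// Pure helper: which classnames have no matching .swf in $localFiles?
function missingClassnames(array $classnames, array $localFiles): array
{
    $have = array_map(fn($f) => basename($f, '.swf'), $localFiles);
    return array_values(array_diff($classnames, $have));
}

// Crawl the furnidata file and download any SWFs not present in $swfDir.
// Assumed XML shape: <furnitype classname="..." revision="...">.
function crawlFurnidata(string $xmlUrl, string $swfDir): void
{
    $xml = @simplexml_load_file($xmlUrl);
    if ($xml === false) {
        fwrite(STDERR, "Could not load furnidata\n");
        return;
    }

    $types = [];
    foreach ($xml->xpath('//furnitype') as $type) {
        $types[(string) $type['classname']] = (string) $type['revision'];
    }

    $local = glob($swfDir . '/*.swf') ?: [];
    foreach (missingClassnames(array_keys($types), $local) as $name) {
        $rev = $types[$name];
        // Assumed CDN pattern; adjust if the real furni URLs differ.
        $swf = @file_get_contents("https://images.habbo.com/dcr/hof_furni/$rev/$name.swf");
        if ($swf !== false) {
            file_put_contents("$swfDir/$name.swf", $swf);
            echo "Downloaded $name.swf (revision $rev)\n";
        }
    }
}

// Run from cron every 24 hours, e.g.:
// 0 3 * * * php /path/to/crawler.php
```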
 

Chefao

Member
Apr 30, 2017
45
27
Yes, I tried to do this, but I don't know the right PHP function for it. I tried preg_match, but it doesn't work and I can't tell why. There's another problem that made me give up: the revision changes. For example:




7857 is the revision id.

Each furniture has a different revision id, which complicated my code. I'll leave this tool aside. Thanks for the tips!
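For the record, the changing revisions are exactly why you parse them out of furnidata instead of hard-coding them. A preg_match_all sketch is below; the attribute layout in the sample is made up for illustration (a real parser should use SimpleXML rather than regexes).

```php
<?php
// Pull classname => revision pairs out of furnidata-style XML with a regex.
// The attribute order (classname before revision) is an assumption.
function extractRevisions(string $xml): array
{
    preg_match_all(
        '/<furnitype[^>]*\bclassname="([^"]+)"[^>]*\brevision="(\d+)"/',
        $xml,
        $m,
        PREG_SET_ORDER
    );

    $out = [];
    foreach ($m as $match) {
        $out[$match[1]] = (int) $match[2]; // classname => revision
    }
    return $out;
}

// Small hard-coded sample in the furnidata style (invented for this example):
$sample = '<furnitype id="13" classname="shelves_norja" revision="7857">';
print_r(extractRevisions($sample));
```

Because the revision comes out of the file itself, the download URL can be rebuilt each run and the changing ids stop being a problem.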
 

Higoka

Active Member
Dec 16, 2018
174
74
I think this is what you need:

You could make a cronjob that, for example, runs every 30 minutes and automatically downloads from Habbo. You can define which domain it should download from, for example com, de, nl, ...

With this in mind, you could even go a step further and run multiple cronjobs, each downloading from a different domain.

As you can see, the possibilities are nearly endless. I'm also open to new ideas/features to implement, just let me know.
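The multi-domain setup can be sketched as one script parameterised by hotel TLD, with one cron entry per domain. The furnidata path is assumed to match the .com one, and the hash segment is reused from the first post purely as a placeholder; each hotel presumably has its own.

```php
<?php
// Build the furnidata URL for a given hotel TLD (com, de, nl, ...).
// The hash below is the .com example from this thread, used only as a
// placeholder; each domain likely serves its own hash.
function furnidataUrl(string $tld): string
{
    return "https://www.habbo.$tld/gamedata/furnidata_xml/f6a4ed5273c975e70a74becfd411367de0d56240";
}

$tld = $argv[1] ?? 'com';
echo "Would fetch: ", furnidataUrl($tld), "\n";

// Example crontab entries, staggered so the domains don't all hit at once:
// */30 * * * * php /path/to/fetch.php com
// 5,35 * * * * php /path/to/fetch.php de
// 10,40 * * * * php /path/to/fetch.php nl
```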
 
