
I have several TB of borg backups that I uploaded to Backblaze B2. I could immediately see how much storage I was using, how many API calls I was making, and so on. It's very easy to see and predict the next bill, and I can see exactly which bucket uses the most resources and which is growing over time.

Because I'm cheap, I want to upload those files to AWS Glacier, which theoretically costs a quarter of what B2 charges for storage, but whose API calls are extremely expensive. So I want to know the details; I wouldn't like to get a bill with $5 in storage and $500 in API calls.

I uploaded a backup, but nowhere in AWS can I see how much storage I'm using, how much I'm going to pay, how many API calls I've made, how much user XYZ has spent, and so on.

It looks like it's designed around an approach of "just use our product freely, don't worry about pricing, that's a problem for your company's finance department".

In the AWS console I found "S3 Storage Lens", but it says I need to delegate access to someone else, for reasons. I tried to create another user in my one-user org, but after wasting two hours I wasn't able to find a way to add those permissions.

I tried to create a dashboard in AWS Cost Explorer, but all the indicators are null or zero.

So, how can I see how many API calls I've made and how much storage I'm using, to predict the final bill? Or is the only way to pray, wait until the end of the month, and hope everything is itemized in detail?

[–] fluckx@lemmy.world 1 points 8 months ago

If you created a new account, you should have configured a root email address for it. That address should have received an email to log in and set the initial password, IIRC.

You can get an estimate of what it's going to cost by going to https://calculator.aws

Uploading to AWS shouldn't really cost much, unless you're sending a lot of PUT requests. Since they are backups, I'm going to guess the files are large, will be uploaded as multipart uploads, and will therefore each invoke multiple API calls.
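
For reference, a minimal boto3 sketch of what such an upload looks like. The bucket name, key, and part size here are placeholders, not anything from the post:

```python
# Sketch only: upload one large backup file to S3 as a multipart upload.
# "my-borg-backups" and the 100 MB part size are assumptions.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# With 100 MB parts, a 100 GB file becomes roughly 1000 UploadPart requests,
# plus one CreateMultipartUpload and one CompleteMultipartUpload call.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,
    multipart_chunksize=100 * 1024 * 1024,
)

s3.upload_file("archive-2023-12-10.tar", "my-borg-backups",
               "borg/archive-2023-12-10.tar", Config=config)
```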

My suggestion would be to upload it to S3 and have it automatically transition to Glacier for you using a lifecycle rule.
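
Something along these lines sets up that transition (a sketch; the bucket name, prefix, and one-day delay are assumptions):

```python
# Sketch only: transition objects under the "borg/" prefix to Glacier after 1 day.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-borg-backups",  # assumed bucket name
    LifecycleConfiguration={
        "Rules": [{
            "ID": "borg-to-glacier",
            "Status": "Enabled",
            "Filter": {"Prefix": "borg/"},
            "Transitions": [{"Days": 1, "StorageClass": "GLACIER"}],
        }]
    },
)
```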

Cost Explorer would be your best bet for getting an idea of what it'll cost you at the end of the month, as it can do a forecast. There is (unfortunately) no way to see how many API requests you've already made, IIRC.
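
If you prefer an API over the console, the Cost Explorer service exposes the same numbers. A rough sketch (the dates are placeholders, and note the Cost Explorer API itself bills a small per-request fee):

```python
# Sketch only: month-end cost forecast plus daily S3 spend so far.
import boto3

ce = boto3.client("ce")

forecast = ce.get_cost_forecast(
    TimePeriod={"Start": "2023-12-11", "End": "2024-01-01"},
    Metric="UNBLENDED_COST",
    Granularity="MONTHLY",
)
print("Forecast to month end: $", forecast["Total"]["Amount"])

usage = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-12-01", "End": "2023-12-11"},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    Filter={"Dimensions": {"Key": "SERVICE",
                           "Values": ["Amazon Simple Storage Service"]}},
)
for day in usage["ResultsByTime"]:
    print(day["TimePeriod"]["Start"], day["Total"]["UnblendedCost"]["Amount"])
```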

Going by the S3 pricing page, PUT requests are $0.005 per 1,000 requests (N. Virginia).

Going by a docs example:

For this example, assume that you are generating a multipart upload for a 100 GB file. In this case, you would have the following API calls for the entire process. There would be a total of 1002 API calls. 

https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html

Assuming you're uploading 10x 100 GB according to the upload scheme mentioned above, you'd make 10,020 API calls, which at $0.005 per 1,000 requests comes to about 10 * $0.005 = $0.05.
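
Spelled out as a quick back-of-the-envelope check (same hypothetical sizes and prices as above):

```python
# Sketch only: rough PUT-request cost for 10 x 100 GB multipart uploads.
parts_per_file = 1000                  # 100 GB file split into ~100 MB parts
calls_per_file = parts_per_file + 2    # plus initiate + complete calls
total_calls = 10 * calls_per_file      # 10 files -> 10,020 calls
put_price_per_request = 0.005 / 1000   # $0.005 per 1,000 PUTs (N. Virginia)
print(total_calls, round(total_calls * put_price_per_request, 4))  # 10020, ~$0.05
```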

Then there's the storage cost on Glacier itself, plus the one day of storage on S3 before the data transitions to Glacier.

Retrieving the data will also cost you, as will downloading the retrieved data from S3 back to your device. If we're talking about a lot of small files, you might incur some additional costs for the KMS key you used to encrypt the bucket.

I typed all this on my phone, and it's not very practical to research like this. I don't think I'd be able to give you a 100% accurate answer even if I were on my PC.

There are some hidden costs, which aren't hidden if you know they exist.

Note that (IMO) AWS is mostly aimed at larger organisations, and a lot of things (like VMs) are often cheaper elsewhere. It's the combination of everything AWS does and can do that makes it worthwhile. Once you have your data uploaded to S3, you should be able to see a decent estimate in Cost Explorer.

Note that extracting all that data back from S3 to your on-prem setup (or anywhere else, should you decide to leave AWS) will cost you a lot more than it cost you to put it there.

Hope this helps!