How can I change the default bucket of a Google Cloud-based Hadoop-enabled cluster after I created it?
After I create a Google Cloud-based Hadoop-enabled cluster, I want to change the default bucket to a different one. How can I do that? I can't find the answer in the Google Cloud documentation. Thanks!
Did you create the cluster by hand, using bdutil, using Cloud Dataproc, or through some other means?
bdutil
If you used bdutil, see the "choose a default file system" section in the bdutil setup documentation.
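For example, redeploying against a different bucket might look roughly like this (a sketch based on bdutil's documented flags, not a verified recipe; my-new-bucket is a placeholder):

    # Redeploy the cluster pointing bdutil at a different config bucket.
    # -b / --bucket sets the bucket bdutil uses; "my-new-bucket" is a placeholder.
    ./bdutil -b my-new-bucket deploy

In bdutil_env.sh, the DEFAULT_FS variable ('gs' or 'hdfs') controls which file system the cluster treats as its default.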
Cloud Dataproc
If you used Cloud Dataproc, you can access any bucket to which your project has permission by using its gs:// URI. If you want to connect your cluster to a new default bucket for logs, however, you will unfortunately have to create a new cluster.
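For instance (a sketch; the bucket and cluster names below are placeholders), you can read from any permitted bucket directly on the cluster, and you can recreate a cluster with a different staging bucket using the --bucket flag:

    # List an arbitrary bucket from a cluster node; the GCS connector
    # is preinstalled on Dataproc, so gs:// paths work out of the box.
    hadoop fs -ls gs://my-other-bucket/

    # Recreate the cluster with a different staging bucket.
    gcloud dataproc clusters create my-new-cluster --bucket my-other-bucket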
Other method
If you used a different method, like the "click to deploy" launcher, I recommend you give Dataproc or bdutil a try.