Comments:
Got a clear picture of Streams, the different types of Streams, and how they differ. Thank you!
As usual, great video
Very clear explanation. I have one question: I created a default stream, then ran Insert, Update, and Delete DML operations on the source table. If we consume only the Insert data from the stream and commit, the rest of the changes are also reset. Why?
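This is expected behavior as I understand it: a stream's offset advances as a whole when any DML statement reads from it inside a committed transaction, so consuming just the inserts still clears the updates and deletes. A minimal sketch (table and stream names are hypothetical):

```sql
-- Hypothetical names: SRC (source), TGT (target),
-- SRC_STREAM (standard stream on SRC) holding INSERTs, UPDATEs, DELETEs.
BEGIN;

-- Consume only the freshly inserted rows...
INSERT INTO TGT (ID, VAL)
SELECT ID, VAL
FROM SRC_STREAM
WHERE METADATA$ACTION = 'INSERT'
  AND METADATA$ISUPDATE = FALSE;

COMMIT;  -- ...but the commit advances the stream offset past ALL changes

-- The stream is now empty; the update and delete rows were reset as well.
SELECT COUNT(*) FROM SRC_STREAM;
```

A common workaround when different consumers need different change types is to create one stream per consumer on the same table.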
If you could share some knowledge on procedures and automating them using Tasks, walking through one real case, it would definitely be helpful to all.
Great work, master. Eagerly waiting for the next video.
Very well explained!!!
WOW... extraordinary tutorial, can't find this anywhere. Thanks for sharing your knowledge.
Thanks for the good lesson, but the question of whether a stream costs a lot has not been answered.
Great work and great help.
Excellent video!! Can you please add a video on streams on views (also secure views) and the various scenarios that will play out?
Excellent, Sir...!!!! I am learning Snowflake from your video series.... I am a beginner to Snowflake, but your teaching made me comfortable with it...
I wish to recall one of the other comments that called you the "Spiderman of Snowflake".... Kindly keep posting, Sir... Many thanks for your efforts and TIME on this knowledge sharing.....
One request, Sir: it would be helpful if we could get the code for hands-on practice.....
What is the use case for having multiple streams on a table?
So the remaining 5 chapters are in the making, is it?
The link is unsecured. Can you share another link, please?
I don't have permission to access the SQL scripts. Why?
What are "stale" and "type" in the stream result table?
So there are 3 types of streams in Snowflake: delta (standard), append-only, and insert-only (for external tables).
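For reference, the three types mentioned map to these creation options (table and stream names are placeholders):

```sql
-- Standard ("delta") stream: tracks inserts, updates, and deletes
CREATE OR REPLACE STREAM S_STD    ON TABLE T1;

-- Append-only stream: tracks inserts only; updates and deletes are ignored
CREATE OR REPLACE STREAM S_APPEND ON TABLE T1 APPEND_ONLY = TRUE;

-- Insert-only stream: for external tables, tracks newly added rows only
CREATE OR REPLACE STREAM S_EXT    ON EXTERNAL TABLE ET1 INSERT_ONLY = TRUE;
```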
It's showing STALE_AFTER = 14 days (default).
Can we change or set the stream's STALE_AFTER to any particular interval, for example STALE_AFTER = 1 day or STALE_AFTER = 2 years?
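As far as I know, STALE_AFTER cannot be set directly on the stream; it is derived from the source table's retention settings. For a permanent table, Snowflake can extend retention for unconsumed streams up to MAX_DATA_EXTENSION_TIME_IN_DAYS (default 14, maximum 90), so a value like 2 years is not possible. A sketch (table and stream names are placeholders):

```sql
-- Extend how long Snowflake retains change data for unconsumed streams
-- (the maximum allowed is 90 days; 2 years is not supported)
ALTER TABLE T1 SET MAX_DATA_EXTENSION_TIME_IN_DAYS = 30;

-- STALE_AFTER in SHOW STREAMS should now reflect the longer window
SHOW STREAMS LIKE 'MY_STREAM';
```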
Very useful.
Can you please post text tutorials for all the videos? Only some topics are available as text; the rest are videos only.
Why not use the default VARCHAR length? If we use columns in tables without defining any length, keeping it at the maximum, Snowflake confirms there is no impact on performance. Do you see any impact on downstream systems or any ETL tools that use Snowflake?
You're the hero we need, Sir!!!! Thank you for posting this. Your videos are very much on point. The very hands-on training you provide sets you apart from all the other resources out there. Thank you for your time and knowledge in posting these videos.
Business problem I am struggling with: I have a table whose data is replaced by new data every week. It's in JSON format, so I structure it after loading into Snowflake. The problem is I am unable to capture what changed when the new data is loaded and replaces the old data. For example, I might want to check what the total savings were till June versus what the savings are now, but I can't, because I don't have a date to measure against. It's like a snapshot functionality I want to add to the table, so that every time new data is loaded it captures the change. Please suggest some solutions.
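One common pattern for this (not from the video; just a sketch with hypothetical table and column names) is to append each weekly load into a history table with a load date instead of replacing the data in place, so point-in-time questions like "savings till June vs. now" become a simple filter:

```sql
-- History table keeps every weekly snapshot with its load date
CREATE TABLE IF NOT EXISTS SAVINGS_HISTORY (
    LOAD_DATE  DATE,
    ACCOUNT_ID VARCHAR,
    SAVINGS    NUMBER(18, 2)
);

-- Each week, append the freshly structured JSON data tagged with today's date
INSERT INTO SAVINGS_HISTORY
SELECT CURRENT_DATE, ACCOUNT_ID, SAVINGS
FROM SAVINGS_STAGING;

-- Compare a past snapshot (hypothetical date) with the latest one
SELECT LOAD_DATE, SUM(SAVINGS) AS TOTAL_SAVINGS
FROM SAVINGS_HISTORY
WHERE LOAD_DATE IN ('2023-06-30',
                    (SELECT MAX(LOAD_DATE) FROM SAVINGS_HISTORY))
GROUP BY LOAD_DATE;
```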
Explained very well with excellent examples. Thank you.
Thanks for the video. Just one thing: I was getting an error while trying to access the SQL link from the description. Is that restricted to a certain group? Cheers!
Yours is far better than Udemy courses.
Could you please check the resource file link? It is not working.
What does "first-class object" mean?
Thank you for sharing amazing content. I have a doubt: when S3 delta data flows through Snowpipe into landing tables, the file header also populates as values, even though we already set skip_header=1 while defining the file format.
Why is it considering the headers as values while loading data into the landing tables?
One of the BEST TUTORIAL series to learn Snowflake. This one is too good and better than any paid course. Could you please share the SQL/lab practice material in the description box for each lesson?
Hi, please provide the code you are writing in the Snowflake terminal.
Where is the stream storage cost, sir?
That was a great video. What if I have to implement SCD Type 2 without a stream and task on a table with 500 columns? How can we implement it?
This is really awesome.
How can I get this code for execution?
How can we know the stream status, i.e., whether data is still being read from the base table or reading from the base table has completed?
If we create a stream with SHOW_INITIAL_ROWS = TRUE, then GET_DDL does not show this option. Can we get the full structure with any other option? Could you please reply on this?
This is awesome..
I was looking for a quick revision of Snowflake, and these are some of the best tutorials I have ever seen. Thanks a ton.
Hi Sir,
Thanks for the great tutorial on Streams. It was really easy to follow your video. However, I am curious whether we can create a stream that captures changes to only some columns of a source table, not all of them.
For example:
TEST_TABLE (COL1, COL2, COL3) contains 3 columns. I only care about changes to COL1 and COL3, and want to ignore COL2.
So is it possible to create a Stream this way:
CREATE OR REPLACE STREAM TEST_STREAM
ON TABLE TEST_TABLE (COL1, COL3)
Or is there any alternative approach to achieve the above behaviour? Any help is appreciated. Thanks again for the great series.
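To my knowledge, CREATE STREAM does not accept a column list, but Snowflake supports streams on views, so one workaround is to create a view that projects only the columns of interest and put the stream on the view (names taken from the example above):

```sql
-- View exposing only the columns we care about
CREATE OR REPLACE VIEW TEST_VIEW AS
SELECT COL1, COL3 FROM TEST_TABLE;

-- Stream on the view: updates touching only COL2 should not surface,
-- since they produce no net change in the view's rows
CREATE OR REPLACE STREAM TEST_STREAM ON VIEW TEST_VIEW;
```

This behavior is worth verifying on your own account, as streams on views carry some restrictions on the underlying tables and change tracking.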
For a delta stream, it seems it is only capturing the current version of the table, and everything shows as INSERT. Can you please help me understand?
Hi, I am new to Snowflake and went through the video. I have one doubt: files are in an Azure container, and these files arrive in the container every day; the data is to be copied to Snowflake (this will be the source table, right?) continuously. My doubt is: is the data coming from the files every day (with DML operations on the same data) simply loaded into the source table as new records, or are those DML operations applied to the source table?
too good
Gold mine of knowledge...
Thank you for all your efforts Sir 🙏
This channel is probably amongst the best out there...😇
I hope it gets the appreciation in terms of numbers on YouTube that it deserves. 🤞👏
Thank you for the wonderful playlist.
Nice explanation, keep going. Thank you!
Currently I am going through the videos, and I am preparing for the Snowflake Advanced Data Engineering certification. Could you please guide me on which playlist is more oriented towards the data engineering course content? And where can I download the worksheets you use in the videos?
Does it work with external tables?
Hi, I followed the same steps as in the video, but my stream has not captured the updated case; it always stores only the new updated row.
Here is a 56-year-old lady learning Snowflake, who has earned 2 Snowflake badges and is preparing for the SnowPro certification. Thanks, Data Engineering team, for putting out such amazing videos.