I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable, objRecs. I connect that to a Foreach container with an ADO enumerator using the objRecs variable in "Rows in first table" mode. I defined variables and mapped them to the columns.
I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.
Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table from which I retrieved the data in the Execute SQL Task (I have the primary key). If I drop a Data Flow Task inside my Foreach container, how do I pass the variables as input to an OLE DB Destination on the Data Flow?
Also, how would I update the original source table where source.id = objRecs.id?
Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.
Thanks,
Steve
smcneese wrote: I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. [...] How do I pass the variables as input to an OLE DB Destination on the Data Flow? Also, how would I update the original source table where source.id = objRecs.id?
You can use the Derived Column Component to put variable values into the pipeline.
-Jamie
|||Use a Multicast and an OLE DB Command to perform the update on the original table.|||
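For reference, the OLE DB Command transform runs a parameterized statement once per pipeline row; a sketch of the update it would execute (the table and column names here are assumptions, not from the thread):

```sql
-- Runs once per row flowing through the OLE DB Command.
-- The ? placeholder is bound to the pipeline's id column
-- on the component's Column Mappings tab.
UPDATE dbo.SourceTable
SET Processed = 1
WHERE id = ?;
```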
I looked at this, but the Derived Column Component requires an input also. How do I configure that?
Thanks,
Steve
|||smcneese wrote: I looked at this, but the Derived Column Component requires an input also. How do I configure that?
You can't use the Derived Column Component as a source adapter, so in this case you'll have to use a Script Component.
I would question whether this is really what you want to do, though. Executing a data flow just to process one row of data with one column in it seems like the proverbial sledgehammer to crack a nut. What are you trying to achieve, exactly?
-Jamie
|||
I question it myself. ;) I am trying to perform what I thought was a simple task, but I don't know DTS/SSIS well enough to know the correct approach. I spent all day reading articles trying to understand, but I am just not getting it.
I have a table in SQL Server 2005 that gets status records written to it for events that occur on wireless phones. When a record is written into this table, I need to write a record into a table of a legacy application running in Access. My original thought was to write a managed stored procedure in C#, but someone suggested SSIS so I thought I would give it a try. I would think this to be a simple problem to solve and exactly what SSIS is for, but I cannot figure it out.
Can you or someone point me in the right direction? My current approach used an Execute SQL Task to read all of the records that have not been processed. Then I used a Foreach container, thinking I could take each row, write it to Access, and then update the original source row to indicate it was processed. I would prefer not to do it row by row, but the source table is constantly being written to, so that needs to be accounted for. Any help from you experts would be greatly appreciated.
Thanks,
Steve
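The read-then-mark batch pattern described above can be sketched in plain T-SQL (the table and column names below are placeholders, not the actual schema):

```sql
-- 1. Pick up only the rows that have not yet been copied to Access.
SELECT id, EventType, EventTime
FROM dbo.PhoneStatus
WHERE Processed = 0;

-- 2. After those rows have been written to Access, flag exactly the
--    ids that were copied. Keying on the captured ids (rather than
--    Processed = 0 again) avoids flagging rows that arrived while
--    the batch was running.
UPDATE dbo.PhoneStatus
SET Processed = 1
WHERE id IN (/* ids copied in step 1 */);
```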
|||smcneese wrote: I am trying to perform what I thought was a simple task, but I don't know DTS/SSIS well enough to know the correct approach. [...] Can you or someone point me in the right direction?
I don't really see a problem updating the table while it is being written to. Modern RDBMSs are supposed to handle multiple concurrent connections to the same resource - that's kind of the point.
However, you say that when a record is written to the table in SQL Server 2005 you need to write it to the Access application. That is inherently a real-time operation, and SSIS is a tool for batch operations. Given that 10,000-foot view of the problem, I don't think SSIS is what you want. If you do NOT need to write to the Access application at the same time as the record arrives, then this becomes more of a batch operation and SSIS will be more suitable.
-Jamie
|||
I should have stated "near real time". Doing this batch every x minutes is fine. My point was about concurrent operations on the RDBMS. If I use an ADO.NET Source on the Data Flow with the appropriate SQL statement to select my records, I can then pass the results to the OLE DB Destination for Access. Now how can I use that same result set to go back and update my source table's "processed" column so those rows don't get picked up again the next time the batch process runs? Can I split the output of the ADO.NET Source and use it as input to two other processes? Is that the way to do it?
Thanks for your help Jamie!
|||smcneese wrote: [...] Can I split the output of the ADO.NET Source and use it as input to two other processes? Is that the way to do it?
Yes, that is the way to do it. And jwelch replied on this thread telling you exactly how.
-Jamie
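Conceptually, the Multicast transform simply hands every source row, unchanged, to each of its outputs. A minimal plain-Python simulation of that fan-out (not SSIS code; the names are made up for illustration):

```python
# Simulate an SSIS Multicast: each row from the source query is
# delivered unchanged to every downstream consumer.
def multicast(rows, *consumers):
    for row in rows:
        for consume in consumers:
            consume(row)

access_rows = []    # stand-in for the OLE DB Destination (Access table)
processed_ids = []  # stand-in for the OLE DB Command that flags rows

source = [{"id": 1, "status": "ACTIVE"}, {"id": 2, "status": "IDLE"}]
multicast(source,
          access_rows.append,
          lambda row: processed_ids.append(row["id"]))
# access_rows now holds both rows; processed_ids == [1, 2]
```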
|||
All,
Thank you for all of your help. Everything finally clicked on how this all works. Solution working fine, tested and in production.
Thanks,
Steve
|||smcneese wrote: Thank you for all of your help. Everything finally clicked on how this all works. Solution working fine, tested and in production.
Good news. A happy customer, for a change.