Hi Thomas,
We checked SMW01 and the related bDoc was actually "green", even though SMQ2 shows an error in the queue; the subsequent bDoc entries in SMQ2 (presumably for the document flow replication from ECC/R3) are then not processed.
Brent
Hi Conall,
My issue is as below.
We added a new country, JERSEY (JE), and its region in ECC, including the GEOCODING info for JE. Through the DNL_BASIS* objects, I downloaded the data to CRM.
But in this process the GEOCODING info for country JE was not downloaded, and I am not able to find any suitable object to download it. Because of this, customers created with JE in ECC are failing in CRM with a Bdoc error about missing GEOCODING info for JE.
Should I maintain the GEOCODING info for country JE in CRM SPRO (the same as in ECC) and transport it across CRM QA/PRD, or is there any report/way to download this GEOCODING data from ECC to CRM?
Regards
Suresh.
Was a solution ever found for this? I am having the exact same issue. If I create an inbound delivery and pack HUs that way, the inspection lot is created. However, if I use COP1 and COWBHUWE, the inspection lot is not created. My client is using a custom program that creates HUs with COP1 and COWBHUWE in the background.
Okay, taking today's date (10/10/2014), this week (starting Sunday 10/5/14):
Shift C is on Days
Shift A is on Afternoons
Shift B is on Nights
Next week (10/12/14 - Sunday)
Shift A is on Days
Shift B is on Afternoons
Shift C is on Nights
Following week (10/19/14 - Sunday)
Shift B is on Days
Shift C is on Afternoons
Shift A is on Nights
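The rotation above repeats every three weeks, so the crew holding each slot can be computed from the week offset. A minimal sketch in Python, with the reference Sunday and crew order taken from the list above (not from any system configuration):

```python
from datetime import date

# Reference Sunday: week of 10/5/2014 -> C = Days, A = Afternoons, B = Nights
REF_SUNDAY = date(2014, 10, 5)
# Order in which crews take the Day slot each week: C, then A, then B, repeating
ROTATION = ["C", "A", "B"]

def shifts_for(d: date) -> dict:
    """Return which crew works Days/Afternoons/Nights in the week containing d."""
    # Weeks start on Sunday; count whole weeks elapsed since the reference week.
    week = (d - REF_SUNDAY).days // 7
    i = week % 3
    return {
        "Days": ROTATION[i],
        "Afternoons": ROTATION[(i + 1) % 3],
        "Nights": ROTATION[(i + 2) % 3],
    }

print(shifts_for(date(2014, 10, 10)))  # week of 10/5: C Days, A Afternoons, B Nights
```

Floor division keeps this correct even for dates before the reference Sunday.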
Hi Pavan,
We have several Java applications that interface with our BI4.1 SP2 environment via the Java SDK. Because the SDK code is unchanged from our prior version (the BI4 SP7 rollout), the apps did not need to recompile their code. However, even before the upgrade we were seeing GLF files written to opt/IBMWebAs/profiles/profileServer/ on the JVM hosts whenever an issue is encountered between the SDK and the BI4 environment (report not available, bad parameter, etc.).
Is there a way to change where the GLFs are being written or to suppress them on the hosts altogether?
Thanks,
TC
Hi
How do I know that a database was in quiesce hold for an external dump when the external copy was done?
Is this written in the instance log somewhere when the server is started with the -q option?
Regards Tomas
Hi Patrick
Check that all your telegrams are getting created and that you are getting a confirmation back to SAP.
Also, use a queue-type resource with MFS.
Check all the structures used under:
Material Flow System (MFS) -> Telegram Processing -> Define Telegram Structure -> Define Telegram Structure for PLC
and
Material Flow System (MFS) -> Telegram Processing -> Define Telegram Structure -> Define Telegram Structure for PLC Interface Type
Check that all structures are working as expected.
Regards
Suraj
The business does not like the SAP standard report view and would like to know if it can be accessed using Lumira.
Would free-hand SQL or a Universe give me access to the report (e.g. ME2M), or would that need detailed configuration? What would the Universe consume?
I understand that Analytic Views on HANA can be consumed by Lumira; please correct me if I am wrong.
Thanks
Thank you Erisa. I just did it; I'll keep you all informed about the ticket.
Thank you again to everybody.
Malega,
if you search in this same SCN group for the keywords web dynpro and mashup, you will find more info.
The basics are straightforward:
1. Get a URL for the transactions, usually from the Web Dynpro application; you may need an ABAPer to help you with that.
2. Create a URL or HTML mashup, and you can even pass parameters (aka port bindings). See the help guide. URL mashups are a bit sturdier since you get a new tab or window; an HTML mashup embedded in a facet can be a bit touchy.
Now be aware that your ECC URLs will generally only work inside the VPN. If you have ideas about using this on an iPad outside the VPN, it will not work.
Tim.
Good day, Oscar.
Have you tried using SAP's master data cleanup wizard? It is under Administration - Utilities. It will only let you delete items that are not recorded in any SAP documents, which saves you the query. Be sure to take a backup before running this wizard.
The Stock Audit will only show you items whose inventory has been affected, but not those that have been, e.g., requested (purchase order) or ordered (sales order).
I hope this is useful to you.
Regards,
JC.
Hi Tobias,
Do you have a screen shot of the system where it is not working?
From the screen shot above, it looks to be working as expected: by clicking the Technical Information button in the pop-up you should get the Technical Information icon, which you then just drag and drop to the area of the screen in question:
If this is not what happens when you click the Technical Information button, can you attach a screen shot of what does happen?
Kind regards,
Angela
Hi Chinmay,
We also have a similar kind of requirement, but I don't see index.html; we only have index.jsp. We are using Tomcat 7. Please let me know where to make the changes.
Thanks,
Bilal
Post the complete output from the following:
=========================== submit to 'isql'
sp_helpdb tempdb
go
sp_helpdevice
go
set showplan on
set statistics io,time on
go
-- execute each proc twice; once to compile,
-- second time to re-use in-memory query plan
exec sp_1 -- does the basic select * from mytab
go
exec sp_1 -- does the basic select * from mytab
go
exec sp_2 -- does the insert/select against #tmptab
go
exec sp_2 -- does the insert/select against #tmptab
go
===========================
NOTE: Would recommend posting all output as a *txt attachment to maintain format and make easier to read.
First of all, SQVI is user-specific; it can be used for your own ad-hoc reports only. To create queries for others we need to use SQ01/SQ02. There is an option to convert an SQVI QuickView, but it's just easier to get this done right from the beginning.
From what I see, if you want to link order/delivery/invoice information in one query then, unless all of those documents are always expected to be present, you won't be able to do it easily. For delivery-related billing you'd have to do a LEFT JOIN to the delivery (because it may or may not exist) and also a LEFT JOIN to get the invoice. At that point you'll likely run into an error that two consecutive LEFT JOINs are not allowed. Add to that some fun with the Ship-to party, which could be at either the line item or header level, and it gets even more complicated.
Since we can add some ABAP code to the query, this is not impossible, but I'd guess it's not your area of expertise, so I'd ask for some assistance at this point.
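To illustrate the join logic (this is plain SQLite via Python, not SQ01, and the order/delivery/invoice tables are hypothetical): ordinary SQL happily chains the two LEFT JOINs that the query tool restricts, and the sketch shows why INNER JOINs would be wrong here.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders   (order_id TEXT PRIMARY KEY);
    CREATE TABLE delivery (delivery_id TEXT PRIMARY KEY, order_id TEXT);
    CREATE TABLE invoice  (invoice_id TEXT PRIMARY KEY, delivery_id TEXT);
    INSERT INTO orders   VALUES ('O1'), ('O2'), ('O3');
    INSERT INTO delivery VALUES ('D1', 'O1'), ('D2', 'O2');
    INSERT INTO invoice  VALUES ('I1', 'D1');
""")

# INNER JOINs would silently drop O2 (no invoice yet) and O3 (no delivery yet);
# chained LEFT JOINs keep every order and return NULL for missing documents.
rows = con.execute("""
    SELECT o.order_id, d.delivery_id, i.invoice_id
    FROM orders o
    LEFT JOIN delivery d ON d.order_id = o.order_id
    LEFT JOIN invoice  i ON i.delivery_id = d.delivery_id
    ORDER BY o.order_id
""").fetchall()
for row in rows:
    print(row)
```

In an SAP query the same effect has to come from the join options in the InfoSet (or ABAP code), since the tool limits consecutive left outer joins.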
Hi Leonardo, thanks. The note is helpful; I'm checking it further.
Hello Chenyang,
Thank you for your feedback. Unfortunately, there is no setting for these flags on the IDP or SP side in the NW SSO release that we currently have. On the IDP side you can set up the authentication contexts with their assignment to the login modules, the default contexts, and policies that contain the supported contexts. On the SP side, you can only request authentication context(s), or use the comparison method to request an exact match, better, etc. on the IDP side.
I'm going to request an upgrade of our NW SSO server to SP3, because SAP ships their own OTP login module in SP3 and they must have faced the same issue when making it work with SAML 2.0, assuming they have a working two-level authentication SAML 2.0 setup in SP3.
Regards,
David Barna
Output could have been created manually; the requirements are not checked in that case. You'll see a manual flag in the NAST table in such cases.
And I remember one odd case (discussed here on SCN) where the requirement involved something specific to the creation process, but since the determination analysis was run in change mode, it did not show what the author was expecting. I.e., what you see in the first screenshot is what is valid now, not what was valid when the output was created.
Overall I agree with Florian - why don't you just debug it?